[Binary tar archive — contents not recoverable as text]

Archive members (from tar headers, owner core:core):
- var/home/core/zuul-output/ (directory)
- var/home/core/zuul-output/logs/ (directory)
- var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log)

The remainder of the file is the gzip-compressed payload of kubelet.log.gz; it must be extracted with tar/gunzip rather than read directly.
pumvgS$& &7>E6qMo|!V6W'SЙ&|t2jkr" *OMYrAV]c974/\dunx;J9jnWP[G[mIȫU &,O82}4~0}'`0 'dBbH}!1@&bf.|`˳vxI JU"\(ѐ2%dQ 6jB tUb4wa߄/U{A:,gL5*eM75IB jH1`7"9Ѕ,TMV515Yk#(E r3c-+(Iix0wTۻj#%#OVGTqKGŭ=|v.q[- yHܢә!UDE0YkäexYw8Tm  1\لo x^6Ɛ0Ηj͙.+܎^7<{TzuWoXt96n|ȁ^4=ΜZc#rf@ȃ@Y )be?Q\\E03n" XSL(7@'5pt[I䁳Ȍ4"hBI{qeSt`p!&;Ɲugtդ8^vjC!QnFVzҏt|/XKn!on"wI}1Yl6聕#.AhvMrmp$"%C$4VEf:D 䤠\$zj5Z!xKΒ uB!Tq<'t+ugǼo!HՔ#Ҕ, &cCR=rLBa2i둣.ܕ>O`_P:,MhB~:/%*#2 #Vc3bš}&N>čNݩ@L0ɗa:vL:BG\B303HØ%K>ܢd{ɧUǎHKNֱ"?u-ָ_*[A5xp0̱G#x(7m:bV 2m@/ o4g;t*5}[>B{Y IX ,;Ô6FOX"gNxKgRT+o0hcR<qk|7ߔl cZe?ere /x܉U=fV\9-,0Sg;xԪD#W2$uHN1٠%ڣJN܍.K]rGw3s@Z3_tߵ1xw'k୻V@ 2ԏ,CCeDRg. r%jZ B#NupA[cdȅDT\"טځɓdKblCˡc*"=pwvL2H/,mjU* x ډJwd8itO E~:pu\:9Vtt tdh舦hKpd){5g>%P 2\*HyG[Q˨*XϷRB]=j0YRTFk48ʒ-t:g d1tSg̈́V{ڡX+<>_OJ`|qX6,m0Ww g@d9 h;,lRT^F6L;]VWk4%yDQlQyu)/56 R M!z"͂|R).T>J1T( Y(X;km"C CFK;݁@͆>H%wKpd{:6ODb}||Ty iG/|<)f}puy!fqE'޽λ\Ҿi6Wcj|~Z&ޓ~#?]-r3OB$5BY:EEZ!Xb XAS̮[ U.Fǹ&{@SzRm6s*G;wk8x4 sVx>Ŝlݯac FW)?|?}d{yN'2G~ӯ?׷?~?}υ}/o޿GqD#0Xv݅Oi7޴]5M-vivy׋]TZΜ~\M3]﷫?Oɜ=lAYs^Rɽ ub)՞Y+nŋ43!V+ 1iL <~-&sYkv[ummπF;IQyɍ "2`tEiL M"tg>'DO"yfu2ڰyL}Sai o<.hky/VE﹃RsG  J Ź .x_kt3ۺ=db#v35roVk kv~`}v^M7qU_:L=q9-)h\`;bJFyN7V& 5fRTԃQҬւudRJK<؄!:s9cKjO& uqG; o[od^ؼ,e8>b"6x$XρJG>φ48Yq0cvd p|NNtRj:h0.&@)7D3 ^ ;@0hЕ`^-bXN`0>:vɶ"!-qWtti>AQP#iXP4!09WI*ʝЧ?\&&\1>.K -&@'3ik_qaľngwYAٻFr#W؇pس7hQ4)EIkIqT!_&_,sqQq!=W 嬇\f!L(馠O0%kS:\J Jm oW%J9""Zc`)$pdpNɖYP1E熄":;Ik^ g"ɍ-27QCv[Db\˒;.jsܟJDk1`gY :!(o_ sy\K 5(d1N>VHZʻ%&;;+ဃ6@^=7GH Dh2-ܪf4f 1A\Y`sB&%նfl;5vr]x.T-BWՅMo!qfM;j|r?IsV A2}4Ncp&ڌI'M9~IVWA"vtz1/u$F!EQ#Q(ƒd1P؄%gi;5v a͸<8:!6TUƆ抌Wm\$ZR," a:,gY5DQ=q6iT'c=18c*2P+}6K,19 ]Jd6 ee\ kؾ$N^S޴Phlu'<͟Ӧ7gL޻cV`#+if]JfYf"gBxueebU0ڲZJIÓ%'JDR(٘'@e3Ì@ѐ8˘2^e Hl,[|pT}we)ȑGLɅ[CTP`s&` ܲKƓs|J}\2%\#9s`p~҄&!8\ui@j;z}`VN ëG«FkJhVLռ-夑i-*J%ns^g8RNAV^y@j 佻\,"%i|F 7af&&Bׄ P nF+vq'!(Qy 1]ȨMdN-p!m2l;{_ nF~?L'uLdv)9)(V2zP%E` t ,I0$C,uw--z A:I9!I< ^ZMz Fr-8aB!d Q"]b9F7 dN3m]L h r.%25n |\j&]Ɣ0IT0 '<"f[Fe̒KNd>EHM;;!żuY~{ѷn@[KΠx$w`XXWg f$X9f\0*J*Ùz ݞK]]]A8XicK̜*Eb%:y$j?o[ḤV[CѰ.#E:VDc彴Vy˕7g?O~2 ,&ɐk+r4i)`ӆ2j%mHt ʩ_ 
(t,!S7Td.!W`p:EEgk8T݌G fܿw]liozBwTvz9+wPtwpѶG$\B'# UN,5VwvKFx<MsӁ7+_[wM1#m+:H3Ţ|2" ǖ(ʒ*WZT)G.V,&A DQ&N,^jymv]bpuz> Ӑs:[#\5Ƹh0Ғ5Z X4d'CQ~=t(ON桐*2.){E9:UsB'9NwԮ ='%MzR},l0,5hBV*Vqs^[ޙmoϊ*tUeNnBN{7A~0Jszw|yKbhC4Ͽ޻ C7y?;z&~H4 oHʈ:Hn,xs{)ڶ@ 4g!" Bm',|3mX輪qXhM6M EJ1T pmGLdjFCr3 ou46'8nF.dvVgjx]_ zIWuKG_rH^3-!* ;Wl! *yS X ,V1DQL?I,*ߝy <<4< j9|r,H.$1ys49ZuKԁLeNtzC:jB3fc" vᄞ偉 [`Ls<E>G>F:zCz4Yr0|v .pZ}/ZI5>Hr2JlÉ֛lq\o`8;"lve]씾c[[hp:?]l;uƷOr;_M8eyIlbseJGѫ*/=\Hfd!XЙ>ɳ(Iytgs$]_E Fnb.:!*.#J!@ %7|=Qo[NEkH9O*sP}:C̴CgfulsLn&˄;CدWm$[ RLيмlZY?.OR/):+CTEQk1{r|i援xz a؛ ]5&y|;[*|~}ؖkoKTRqe\ f'Gu' ru~dB'z.?\WNYJwuꇖԶas?b{}NڶL =nc>:-wJZ tϋy\BfQc0]kO]ɕ+W|BF3(i%/Zzڨ10\؉gչCm-sTZ{:nZYVsczu|2W_=ڰGMx\hGizmjk&ה[JYd26չ3/UT{sf_e_vͅ.1ʎ%qx; s`\>= κw Jil%(m9ؒ$UeAAd%mJhù„ܠކ=6q 3Ҧ'UOzqgJ/r5/FK}0dáXDi8;=9l3,Sj -!{AQ4 >͆г]A'4<턖pǶ(LTʾƙp^\7+?xKXfG=gg1Uj0!@믽M׀8lkоyd7xqyCࡿǧc_a:/l߷AѺK:`*Z`~8i98BpRv]a,'׃x0ъnmGXD rs6MfƬBИEk"Fe<(T<x77C8 V6^ g+X_c1{+byv@ 5`]EvwibVz!.=Փx8,ώO/ Ugu ^vS"AƊ?\̉x3(ķ;|3jf5/2o2p;=uQp,X]"Tu} T{LLj#@Ql szߣY&Ljk]şo4JyKgβ3O1r<]~'gSQo>GJh,*IY-RI[i2jSޏ:}H#?:>vN?Vzҫ? G~܄OkQf!l%ɉѪ8-%h&(Aͥ9$TZDʘ6eq1rѪPD1wO54J"7[Xqmʓ39v6Z4m?1W&r>/C4KKRںfQQT,N`R: JU$RK#=dBؔ LmSVFG2eQ}h9(+--H}}kdYZHk Ӝv6Ф4IdPQɔ #0rɷ)Wt4Fѱ]ˡ)kdM4+kǵ[d!= [Z]FL /0zw8ƕlX,O@ɐ>% Ve@b67'C}TeDM*1ڜ18'*JIEf-@2{gm.U[ω9Hк1DDk091¯P#Z)|!-8m43gsK|mOJ=k]:gIhCR [AioeM1j frYdL+NڲRȦPB@*!+ե$d?JRP/q*`!H9gjF[Q`QJ}ZCۆuqx(fM&JR-,jl7%dp-M̍@KDjcl5i*X đΊqi 24.T+hTRTm.3*p AY<90Eiޜ7Phgo\i%EՂAPlQ<,>3؆vn!Vor3E ] * $(#2Rِ6C&Ծe x%0gs,Ź6Xُ^u L+h#1vlPAޠ|ͩ~W0c2YoB9kŘY cGPAA. DU9%;3X@te ` I&- p]BP LF0J892jdP5HG, RH6Y'J@!O)A:ͨlB)vNI2Z#qݳ. PD˚}1QK`#/UT%PH/%i $E4HH( 6jer $z aΡ },|:omBi"34ȧ}rnRYbL:;pB0@ߛ3"wض12aҭï-eMے11- HМ6F: ^#x:p4G&[АJF%J-G @˥{wS yE n:X h`!Z<%EPiDiPyU(|p|RT+W0,9Ѽ8{*ċ@C%AYZ{[{ãfP܆`m@]C7 UfeSU_ }qgD%H 5Yr&貚 I";> ƷۇŪamby\E4Χ,}F_b=^EdB.`Ajz1΁6D. 
-r!P-ft ((BQ{CHU$\PtBtX:"t kTg]I`SR'm>Y9(=ZMV7$t-^;=$@>)YE>Ǯt6 3+H̔HRd'J!2@!~Ѓ\@ CCX("FhZ!X(ga'D D%M@r")5Z'`3,vNj$$,Ne36R*4[Tm2XP@ JX1LP_۴ײW *LCbziC&4C?hk<󼍛^#<aM<8>[̵]D/nfV48z:6tF4ೣɠ̢qզYcV^kA#y(Y-:p4vf caD˟M o9+-Ps8n𠗨5ʡM!Ts ^!7Fh-<7N:(Qr]gT*"H` H H**mz x""0fA= y^æP@>׬&bE]Q("VR2P$@᪝r JedyA ahaO1;JuY1Ԕ*.FB,z@U hH?x.ڈr!cΩ X;uMZifz`JQZ^NFR !W(-LB欭OB?uNNu~9(JS>>:I"oj+q'O 6Pڀ@t…YX)*9 X&] -L9Ezʨ}!1?MnD*s#<5gc)k`Uj4N~ }N1 `F"( q E WrhAH!M*?"u-+yG[DHz+a(^BԎ/KW֡J& 8!de`Xr6n㏗ߡA9U2tn]HŰQ=3>PByCNʝqp8;zʱ98>z@Զ] c/5iqÿNa_|w Gh^]!`va/?,~^,N7jPT14 !"7Ӗ Plb(6@ Plb(6@ Plb(6@ Plb(6@ Plb(6@ PlbkBa Pع1@p;cT P($Cpb'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v=_'$ᆰ3N 埼QR`'st@R@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; |@Bn֟Z2tq쮓 GGM]5¸pqGo 0{`G ;Bc}Ϳ \g}NOێ\W>>Z5!K7 ţ7e`b=¼t߼Vo<^ #[GQyXrpr6JZhfLQ{1(*EPT{y^zvr(Ȝz4j{p}C:YJṓg M n1 }quPD!Gyکc1 #@;cœ7 #J%, JHIlf0,fa6 Ylf0,fa6 Ylf0,fa6 Ylf0,fa6 Ylf0,fa6 Ylf0,fcrihiQ_jگz\^_//_/ \w,W&%%C%3ƥR֙n\Q6.= g]A!BUUʧW=J'!\P*+pգ}l.pQ:p,ʘK,h4n6%|彧t(CKk4uHd.3tLmq .faͻEHV Ϣ+â06Z!dމ"`iZmWy-l-xkaa> "=K _Oah{xZ뉨z;5p5߭;Z^q9=%K9( aq-w>P%Rp&Q<1"1iO@`#De݆),@}QaƎǎ XyռhPj$`NpเM6|gTKJ4w~u[۝йoSJvCL㱎Q0EADKZ\~!rFBB,ZXJL!0" vc#rb~ѠiGzsV1~v )p'[7| DQKą1b[D2ڏ;N 72D[kjȑ.Q 9ہqBPGz$j*Qg6âk3>OeIυss/w n&Kk%]0&QV\|+_kݵ_ߖoO&G_0ׂXIER)gpyz-4!Y+h2Ўy0ud4r~/7٢t 0تE~v́4r)6OA܏W/r47PkצFޝR/ɝeO]y*k@z9O3ρs$K_z.q׀KvWth|ߘdy @/ZG+919)\$IEFU"YV%5GNnjJj6rh8Ka0NF{ \(vϷNfK7KUc9ߎ6Q2E˯]MkrRZS nѼK}Bv²WaGF^ciK ZRHR$z+"lU2HԝQ2f#gՍ( 0x,de!d9E&W`bmnֶ)m&}Exy3WW.AXNu% jSZ@*Fʼn9)5kby^,KXK(IbdKuӘD{ACJGt`R0'UfK\ζ:EjNjwvӊ7p4<\H☁   D9d\0GJ(1a6rZ&x,1xD%"$b'sEڷhS}i=h{gYWQ[-/3sv s$;#G ݑqŬEpLbT\Ǥ*{- `z/ PL; q#wcGQ )~Bs%mDUt$ B `gj#IŤ~8e:(Jɛo'GllǓS|LO.G3TcE$31)S Td. 
o'$VTY96cL4sW`+(r@EDXac 2 R*8NΑXgDeHVVq&?s{` )L5>:m5_Tk*$?4̛އÛ=+*AH0gX(b{wd{; d@4;1U.< ]L~+aP2qQZrMu\ wjpy)PU1s Mm;Hݢ~m}WK7o.K&X`TëS+rG'1Job>_>qǟ0,?Viں4O&FX~kTx{59^x sb.͗~fn^풑W_/O c.ۆ!Ða{`Za-`,]|Xng]-_'2[;*AG=dۨmϊ5U:3IsX>?.u}U08'7lNA,}S6t*`Mt/=r+O?o??}zOOݧ?0`S8i%AMo{!Ckkho64UlUΧNq9eܻl[+}7rkc/oK}j*9'?ik6!旍aͲQTx*,ՂܥE"AL5#>}Mth%FhM<,mW$XXaK=JQR: h[z@gcdXD֢S0yE>_0GZ(I@{Q5A&0AqT .Xeփi5GP ѱ$ʶ&0 [yQuR_:`J':{vgJAB /lPqx.0+z2+ᣋ)aBAc8B3*y)g.aJalJ#68R$;HysQ P.r.HBmOP@zW~tX0]=:%\pZqů{ءt䤟 6Q aӥiVBq 13!2$KE8*1taE$Y%4A0ɵ r\ Ø8P=x:FY^1Tk%iNxfY`*Ya.w[}yk)oC/p@(hٛ@gCm9V \>]CH۫oE9)[,>#+aG.4_c} C=i;O-a/ϓBf_ϜugRӶv;w'!VtFǞ*KhR|vV2Lk,ɣbf aN$1[' 8) AH$4,XG;Ze:ҬUJkC!nH`DI}d2DaΨ@T[T$ȂJwP1G EmȰ, Ƹ#aREb$d>Fs Ŋ~%yĆ3H[Nwio! ,EvUi҈=調wH ,L/^ynL J-Ѣǂ Ѣw1!Wu౺z%}>-}n[lYSژq륧tE\:Mc"Z)JP 6&,X,Xw+0`>/*X\y+W[zF{ g`}}Mzg62M9 dsGaH($Q -9+R 3Cޔ!7q丳(}|& 3!l: !";Wq=DeK . k_cMs%vna[ر<DAd'N#KU,"5LlB*Caj3jX$|2b=6ѦDd6e#gtVl~ ad|[#Ėleǩvv1V;t8^_u[Meݚ3A6.{Ѥ.ݝ齝HZ5tcTLBN^Ȁ.&n?.֛,-l)ѷgZвݻ4z^44eZn^a{Cϼ˚q<`[:[0}ǥސ{r4g?kkR,<_-Zί9I6ҡuts¸|IQ@/'4KK2M2zQJE~Q g^K>Rn|#ޔgI{F+Bi}ȇlXLҘ/uZ˒cN}E,JLYqbTzˌx2 F,rfֱJ wf=m:WYxyEN[ָ XÏ}TT^t;z%9ɦ!%Kn&ty׿^4)#$c r x9bfJXyo:so+ Yt9\&DeD1Hy"coѓ@= ?= &Y0K}!p|yKb¯R.pEi?ufM^NF AIQ~ pDKgɐLbgX!`>)"g7T4;3QYxV VE-6ZrPJpNpn=0( ŗ>4>nj_΍əH>"moߊr.tqq>9(kTK?ޚ>ӈQ:{ȉ.puFMnfžѝ 殺L?xw7WVФQ̵~;9[gg;$ ?\n? 
2*VG_u2zR<|a0`:$ŇCyУٚx&!E+W%h&V+Rv(i$-}6uiQ c's|;I&|< -?/ISKzT:K&w]uǦ\2Y~芩 /?~K5ey&}JIM{IJz; d~=M֧*Z=j{#ҦS=zU[jM=(e=-$6TR/qes]86DTbRYlP&qx󊐭ڝmNW,T;HeU̓w[ڵ/2+壋4WCÄŀ9q^z B* ʹH$t%5' b*C xF: lp6H6 vVk/;^s%h)*۰85-1#1xO1gn:_ ]ٝsW߯ccU:+Ghԡ±Վ"xoѰJ(GET1K%.Вx  sHg[E‰uXT%=EK][甶dD]S薞YlBE3AUp:ȥ 3jg>MLj9+ZXp4{=T9z|0U^iH{ s >6tR , %dY6rˎW?y7 O5ݕ mt[xIVŪe3O1y*Lϳ WXl*T5Uqn^yBe?9;a qIbNd,p)CyFHhzCz~^c"9ŝNlO)p;Ҽ#Pim (rQT;7 q$U S9&r6ȹMEw9lVJHH646ا1p6;Cg fdkTؔED~T)D-R*J^@J]Ȝ] ~,l[M@Nu i%T'h> ʶu,OϹDϑٻ'8LxX X^r؁D) !5d(O+LS,B\9s:tRfRLicR.bJ">:Mc"Z)JG3IPB pWniƳ`x>9$༻Ϝ!@8W R;&&8 xs,(pA,@2Fϴ,sJÃ&Vz?sZp?+x?1w(mUTRs`_h|j@YXpX4cXk2MmԘ(QSY4P$-ED/% J$?vY t`d1&`i ׾Y 8&:o;cƌLb$%#cm*jS3@}›_Bp`|k̂J}bo6{|kg]|^cFq+s&U.p7|N;WeHg\ hB^!71IZ?uJ b񪽞0:l2zpцIc64O#3Zuh͓W5fRy|#= 0|uE,)eϥsv5CvOX=mY\r6YSϑjS#E`9˭dk0xY~0GkNq;N䷫&*QG42#!&3Y&#aJ-uzt:yeH"YNո~^^&ꚓ٣ӗB}n]Xo;$"pٗ =䣸[^v5wo]v) wA1`S[\zueτڹ^9T}H\J0snnd =!"}Jehp %J7bi1g;JbjPHt|K 9|?=uk8 iJΉ`fwk5DɛU,fS Jqe5]wK+(%̖0GKuKtu~`j♉bji./ +;Inj+ڴܳ,ِm%.=1-;&.o9nq1#2ſ"|v':3LGZ+ 2 NKX[7T 3HEZrUq@bjLN'1~\2Ic<Ԍ+wUu վvվ+eh]}loЛt hPZXmsw`[HL411'8"FyY x,N!-'-EsD7LGݯp|Pr񗱛4S7| c9ak2h(rjD.&TfH N0+D{!CJGt`\I{ӹ\Gl;Fpy(M;Y]t(q܀nsA"f Pˤr#0}%) @U@yށȅ4*D 1( k|@N@@>rDA~6jP1~l<3";D\4(Xm FRRsDK6:Z L"@fr!YQ+ b: ES, 0!D҄Ij$fX2#b6qZ/LuHlZ).̸(:\p!ks8Eƍb`ȶYUՕ d V/zcgU8S~$1fSaْ@KLދZ.ip+QA?>vpΓv+i#@dH A,`F„fQD2)j%wڨh4D{Ql=8[B7zf0 ^И;%NŔ+\>y f'W3Vxyz#>08[SsH6W rW%0&*D|gK nK p,9,Jc?3FYNcwIЁ@H; 2g#IBs\*R"!EL@ \JY,4 !I%Fj*e;Q!(8E N< ;Ӣ=NW~}Yn(›Hn $9xS0^ܲg:+vxNn3`U W)a(mi=s1N45 jPK"lVrE"1(&y$DJ;f`R$m ksf H>MDMH`=A++"8$^(p %uŗ@6  B1Tzd&LcGdTaXK"2-3"B6xјvMKc5_k@%vU)F92p1"1iO@`#DrĴ mDᓈ `/TQp(PmYZ[ƶRpE2K)ȩxnGj?C J'|pPAIIK:y Gfƿ^Gh1\G_X^7QN\͞}V*Sig+ql'%ѷ)0$d)  !g}g-;93'/58P[\TYһɼap?F7 )pn p6L΋ׯU-2"z4{(^\oTDFucghc\Դ%2;]/ \ӣ$.9J2rp䬃gW; FRUWc+VlWIJJ:zp(9Jr~,pU cW\]NJZʉ5 //kP:B9ptJW SkNW .fޖF\ARAA!+-(yQ9oTݸW5Mo)%|e&˪- Uo\UBYJ%/5ӨXIJޜEqEw$ X-uT0TH 3KkX2tie$Lӣ$ -W!㤤 _CR23P bP5UrT a/0 K'zO<|1Bay\e()HϘ[չLOFe1лoӅ9g1]`jߨ3t'_~ئ]' f˭Ҁ`ݚݼجg[%68D2} lf0v"g4`[*$TɂĔ#"S/i{p0ѣ -+--3ԋGgZ'o7̲Ƀv~O`(}4}bɌj'ty-AkɚĒE* jK ^Ȍ ^`A Ɯ 
Ӣ}%K fӌMeBaNp.QzuJ.ɒ',񗱛4SGl@˩[1DM|ЀDS gz8_!/7y: pN/k}Hl+翟]␔48dꞪꯪBO  s^,5\lـ88ju$LP =Mb`" r1t6rvkl7)Z6hֆVkv'q -$)G1^}8B1hQQoRYﴠ2"C hE{,2%GQu!@4Y16E#OՈf(Zjĭ)Kr3ιhh AN ТC3ʈv%NP'ժHhG$Ag\p^CQR=C(n/֬k#gڡ^3j6.y^T5EV/nj!bSam}d65OMG1Q.*"!eQqm8!*!͢\ݝ3KI4+ĕ Azr,|yHHh\hP=QS޽]z3Ah1U4D #Igi $A`Czܙd Tvx:E --Xi#1Hu"r<fr3ÿՏ/  MfFix}: N `yVALWW%aH|cnU m|T6 ŕ?5؄ֿw$jR61\)0-}xX`,o區$ZTD*rVU/8&BRD #rV29?AJSĩv!ڄd #3Y`jeDk畜gweFi-f1mZn|(jfrKB>I/xVR,BI53Aqf_K)*N>z*Rhmd4Jc,EOxI=)Z3%G+1$Oꩍg9 TRs6!wl)'"vEkYɲ@@l_y˚xj и,uї#3`HEzXË4Z-A+UVPf]7Q($-Z$lBJ2Lr'(iB- EKc,`YyeoEycY;yL$MJYDJa uhNzNT$E|ъegX*uLh73%#SJAi[Hڀfr9HEAJ;a#{ICH^*v" jM+H ғ |\P h>$} :ɉG*b,h@)$\LɩϲYWC韅7}o+]*( ~Ly:WI^̟[WA)*vm^ Nc!}_\ΨFC ;?B9agCrvSxo\lqo>-aԴNYӬug<e_~\x 6 bnn?Wӵf+NN6+;db bU?#T êaì2R? U̻߻/x9f;69N+GedIVZ5WjԴ}98rĹ?# u8g3*S<ǴnqwWȎAu~y~ӿ_>'w/?\I)UM?E~omX]Cx󡹆}uzD9q,[]rKco?O~NYy42 ܓ;a$fZ;Y%UOUz)&*D f#{̽8l}&>60Rr,81sC ()C&|bI/m뼢?qkg1Z1^>-ϴR dFF EA83r3aSnONh֟Z9G;rҋ븪_:R+N0L?:Ù\)NpV˦F*m7XgX\`u7QJ3BI!+X " -ZD)k񵅆\h88p2JdEΑXs`6mP ֔zv6DB-ME 1AxGUiWwq#=J.t 6, zXq-NF]?} 9 9;]Cٹ9q9I`hV$#5Vhx $R\ঈxb ')9#s*%=%U_rOaUzŋJzʝs:S(i% BID` ܙ!LD$4BhSxt*˘riAs~;K^(ru^~gKYm\Rwmѿ+ԙF9q'ļi&)1Z Ū9f$DZMeIbs.Xz?eolozGz93xv5GTٌ|cPD1 yY@ЖE4pVoVYJc",)$Qcu@h tT :\#z;q.@.; L*$ <#:TZĜS)6<XЯ$oUmq|io7mڛH{lL>Zy$K˃yC^q~8n4#7]I-VG$(H9wX# cC*-T<_q:y7b:_{[_|l E5αѮ>l[u$%+vp9l\ kU +(v?+5GE\cIIOb=b e/_,Aks[PS 4~)\iւzj`BU9/ș2[bƸW'WZwLHk7 (#$40.ptmࡘ: !4L]Xdp ^kfE4 h7Ր'9n=Jo£%o4fk\H[i aЕn$`_N3•L/?J7R DWnÉ<@m~|>U|&kB H@M(u0%CNW(6Q&_LPG'qt Jevko(i,%RM03Q MD;MM.x뜡Dy &SrE*c33T7dtW#uvˣv;x0?ͽG-}s$؝i4>;fwt=ۅz0}kktDn%m<ױLV m۶w{3$MCl]`ru1m{\xG6; u,;Ė,fܺۖAƻ;+{r;??~o9}x~ii{Ϫ.-ʬ]|<Ȳ͋^u{ %NCYvrk]q%**W_W>L}) .*C`K8R:oΞqviYӦ_Z+zˬK9ho/YMB󏹀.9ɢ"7sgzȑ_dG2e)J&BFl֚he%4ir1]N'uw*s)@dI3E-Iūwkdp;S[vJJc`Je jL!aN^F$1')<$R.> 6#%3TJd >A 0jmz2I&,ً ytX[2pqJQL+K"/&˨lHb I-9bsp 07(t+ l(D0cXBQTYclg$"eѕ udBVگY>zfl P=.p-*Yn>^u;!Z߮k^;Nc? 
`la?E:A(:Fy$x;4G{c^XAAgGI iVMFkbgQ+h.J1FUd8޿ZИ@9ScPx^bU$1!0o cM)91}Ț|r>v2fb7Е܅⚍ZO q{y½XSEfe=")`mNU֐|4Qao[KSj5qoKi2I/Ia y$J%Q=ͦs;߈^˶rs/~l mŎvb,ɲRm6#n2ScI  (bR $=owVfUؠcJ;>eVJxC4MNgֽ k-FՎHmB!%+d8Y_c7߬|4=vrL+{((rκ̦1X- 3X6if%V|a~tKzg/f,). *`jwJ$f, ˏ^9tDSZAz̬w!4vHMERsaA[+LdJR6yRt߿VQosm pR3;|\bTN$I=Y'Qc4FjH*$kV<> _i) %"1 UNIv`re̝ɥnUU cjTZDf77MNST yoT/f7UVeD0oFqf~igի cvч9I^xkd}juS8Ȇ@uLB#hZtf3:9 nyn/ZAJQ$1ʫeK̨ttQj_yC}DӾcz~2Q{?YHE_/޼Zꤒh_7ҝ!W8O'kKhޏG:6>>&Liz'^UN'_W =s|2z8-QWmm*=A+&zN˨Cy G]OMqS4Q39ч_~?A*}緬q+0J`ܣGM3oۻ_>WVCk -zා´|-^gf+|ޘH_.濌gޕ~n8]jEt%խ ɣ OdQ0|<-YKK'!׍ZlTao Tw'i]E++ʫ0 Z:$uQ{v.EF.3:FP1**Fm'Uކzc<+U-:֋ᷕPY%d6Xe3hi3پd7f 6k:>_>U\5/=Kn})1tc}9MڅZt> vt0tт4TCbh TE\J J3N۳SG'"omRz%V8gBGm6VhYuZ9D4 G{%BP;Ft$*A&(SK=vG]Vޑ[ !Ac>K^Ȃ3)ip8kQX?Kg<]4PtFBZGKwm+>M^uH1.ʲ^^Gh9_Wug zhtۼȽ^wl{wOmGR\t}z'ٷmIW}cESAMx2zrq5Ior [.&»3A=PZP/Og:urR8 ©Qw9&aЦSd vC ̾AJ6!LP \@%}R@RHLEբY0H&$eQ H[XH2JU.#tnwHWRzv8g]au$qC'#d}Y* 3AL71єxfE Sͻ/٫#t|A!N h%'43gu߶-h?Z- Eݿ 622!}'ӘQTTr>; a ]ItJ2ժΉ*ކa8^5y~ ңh~B_g̅a+m Fxe/gYT)rˍ6l t^B\lYZXءd$.,9SD>z51p 2A{T)&8SBP\R\tm2# i$A}Բ8[v~Mݝ-Af=>Jnϝy}j?`1b[@W97.jt巰ߋLCqJI:m~=ckKO%)-/ ..qz?JZ2VڕqR9HKZdI|R$߯%εzsd0~-WЅbHA {бmkysK1 ӽ8mwnsL> ~'C'㝏}V/juw2$,/AL~dCIIyPۻ47݄fjz1u6^">=է[^c~o*m'PN1elZ9RǢ<8ǢxLr Lj%ݔx0_hwv!,@T X0P:{I ˲T1:/MUP$JXJYrB&M&sJd̅\&Ξ9~:$/|u>NaCtl0)mw<̾7+n͔-[M_K(ZT&R4=O3ёE=7v/U14“h2i|HA5(KVEIa "RE`ˉʀBUYteSfȱZ+'skR7w1:HCo] RH{ȝ"8"YLHF7SGLF4ϛHc*8 Gvugϭ4sH 8 W{f\S "D C8r4~:޾ܤ4'ֶ-?-g Ee50NF>sz鱂*8qqhlVU8 4ӧ.}`g)7[ùDK WI#9z.h:˦h2c-bHFen"bQP .@ Ua[ziw,}OKJ}T/$ziěK+h9[VziH bVAVzi^ZVz9ƺk,l:_Õ%şˁ iEZ̿L/ 7xVطtNˆ[_d;f/.H.o$H 0 R>z1hG5^Fx@S 5 Q;j|GiwQ;j|G5wQ;j|G5wQs5WwbF~;jtG5Fw >&-]w%(%ˍK~\bbIڔIDܗNt<ӧ3YD3w)wܾX>QLyϦbUj&U:/^|h1im=#n9 YW5]ItJ2ժΉSg-m s)lJvz*`'Q|`Ȥ&Զ/Ǒ&4'WEͧq/oz{43R׋3}<|.gY F SYNq]:/!{]1j5j,&m: S<Ͼ4Z06ALë/WEWd wcwzE`;d/^g:9>%c~cPEf-յ|@փ<"gSriN`N-paB+jƝbQ ̦g0YwxZE4ғ"!ntu{?_o[n]wOj 0tJ+UtQ$SEriCK+IZC,=HeNЊ@ jt>*%)u2z3TJ5qZΌ[(V=7/dXi{ـg}ƐZ? 
*5蔴Y.e3l~ҌZwpq8AM2 %2!I(KQ Zgez,3[Fm *\ԇUszbfC4̐=ZȣR6ڂV[\#49dVUjRdbE >&R~ּ%IǤRO`-sB&>nN4礪nyr(_/^U-Zyi2i;ڝ9znF륱sC?b`&2"#L;a\U3g3tMa\IX.glBÍ k2)ښ8{}V*k͌Bu M_.ayQpsO~".hЅ_gϟ{4hi!r΀djDW%L4Qr <Պ*h^e#CP$M(0 h3v,OSLE*kj0=5Pvq(Z[5iNmm`) AYf, !cuݳR6Dp+ rEVtɒĢIh8 0kd,dC[F (CшC5uӈM#n`+, +cTf3EGf*7h栍6Aq1cI+}6KE3J&Nk!YBY,WQ@VM=Y>'z@zq"<$_gYPh+EbӋf<֗(='-t1"<{.5)uR*Y"z4zPaq(>L> Ⱦ* Bя^sc4/[ŕEZ?{F666T50l>ľQ,#' ЀMJUݕY_fe,v{D;̺.>~UXbBݶD-v2}@#PoixO~tuY-M™]o;q0oz#~_F(?`4FYay׾w'GHt̋R_p2訚m4`Pdm4$tyA&&H\2fZt Hib7~*c@*eBR>PŘhEIR$౒*}׬m8OXC74k dXoMUzr6:{@@=Z?VJ>NB:-+(sa_lSV&d{9]ʎr-ퟌRK(줊mI22JT9/lVd[43JK[@z6sBgp}qCjO?y%wnNlӥn }N,r./gm =)O !cq YZ6I۰ 舒T|?L*ޒlB v-K$1&ȥIVD߉3 gߊ^WB+_.vbVڮ,g,zËUYSF.^4a@ *5%F_Q$31Eg9+K<$:t$=Ǎjy^u? \dMqVbebt^ZX (c$ -ind}MXc\,iȒ֢Ӄ:W%ow:+(rRκ̦ Xk(*ڤ [WXa~0ٕBIbY:K9J.赩4S YV,N';14hBvafWZ;8saE[+L,dtդ:Iz )},̔9gϕ\K]@kTNz`tdhAWUWmyV}?bSU"1NiQڕr|miWgEC:^Vg!џߌN~\}ojo'_|kҥr1ů2͵_/f?I'yDΟU[MAcv8wQB0$kZm 1 PBh T )/{ V'{t_UU(V6.ҭEPN-}]񇺂hrQ`/?OC< |htGetZ%ҶhW]zflëU+oJ]=sxa-u FչIޏͩk=5.F~ٺ=//7.[b0'xR{ս]r&_./슷7&?h!V\iy޼6_de՗Y^,f`u?5'uSR|j|zAs6|vzjחSQU)'S'tjyM>l<}d\o^ui 6>gEa+}5֘Ŗ RG\~qOg$-O=p5"WU$+N@na+?qe\8#'Eo,s-{8gBG(R tty1>E7G0A0%Z_4 wGKC&>8Lx%d/p{y팱5^'\;ݷ«z<r]TPxSvTCKpG3~աy%n)vuG_iq>krW* O|w\j۬yF\!W7tܕ/*t^K^XϛnSt{󐫫6OZ|PYH[mp%>+Q*n~,I>l%}FH1W52+lЮa4WсbwjnsUE}QτUh)c1%dZD6KSd)"`Q§O z&ZݔvE碘K+ ])dzyhfVg&om4קɰƊj&@.lMJ'd8j<(\jQr7Q4B]2%Š(QILĘrntR-ostZ mQzU tQ!`҃)iX8&H.єFL}6-r c]wW#4e$+[;fUwew5|Z1ɧV:L>%l 1zL@ h[#"{dIGP3Q"b5G_AT=yuHI ..)jg):uPk@hѡͳp=n o3kl d6wPh lg#^Ǽ[Td 䜬ӟ$y}+ס *4;-qk*mPg< !:.>q"&+ n-OI~Ɠ8. 
ੳZ:K!d$KGTK  s^-5llـ@ceD *mGI LTN>&╮k0c MazGSX:}=>a'iμطdZ88F܁ C_+uRu AԻvSYﴠ2*C hE{rMK#r^.$G 27͇y+~)X[x,#Qt1)Kr(3ιhtJp9~$TF,uB iYh4%D*@8 $( XDqyOC(nϜ{Zmpfx.IzcyQ̋Ŏ)H7m LDE4k.U8MrĐ >/>/6kMC8>| { 4m5G> l2xsјQ'BB;I82̤7J0!^yi ,`NF$ EuB#ſ{ډ$5@w =j: E&aIN<*P3&gjLq Ε;~ u5s^W8-'?~^8OzdKʼ-:ŜCqiX3cTJC w뇩X2$~XpL5ُG('_Dc;+z< ^Ǿ)gגD ث7ZOHD35q;j_ _zZ\ RqCa>W 'V$R=Q7fWr!ov d"m̱!M78O\+uc>ku=]L+"kuz]u٪0&Z,\/A..mͲmWD`8& ߢqqPj]KT \ \jg6'l܊y_ =^ٻMKeRTꦺRԷ(u_|v#fa*2mcZ /U/'~~Yo0]%vǿ =w߿pN>?/qR8A ׽}|S_~޾h*˛5lSֳ\MDUP ٞ_Lc͗xzܾP yRI(xeNTu @D颶 @8eݦ8}"dS2@tH9RO"J=Wb"&IwItOZ#URӮubg-Bz!',m͛ZJ|9 Ī?nt$Ϲf>_;9fV89v@HpihL&~Y|iw¡ Ly>ɋTHr2RJ)pLL̩;Ja{O6s$Ǝ >ٵViu`F J=(R3B1x`OaD5dC I^rPEϗqPL@z!J(T6tRƅLkN:R\)==IOphqOO(%.=+LOL(͟~{gz'qiHqTB .k<ځLk!q$O'=q`:p?מN{:@kڼ3)\3CtLN)mq@*ϰS[o- 5w\5rfX{+^g^O>sMcl֝>J@8;K@"ʈ΁U)[>Qonea<~M5!Ǐ$(atDE'e1}jGcFYg:BQRToEt%a UFD*䝧5ҕ¨6+E%'5t2h*ݹ45UXmVC Q*;zteB2ئUkT[ ʚNW%cO< ~9x?]WX]Py %͢+]юنz9WW+XϺr9ˬ|Z,V˲/AAĩ২ξ+FрǗ3()d^0~)ǂC_q)ZDnMg54h١Tn(t4 iqi"B-t t(ճ6 XBk*U-th:]!J WHW y}*65tp)#mVd+1|]"R< `Y}'6=iZcfͷN3J:|)AH&+"ZCW.o?tQ^#])ІU۞t(YL{t ؼ'_2C -4(ҕΫ60]eB2ZNW[|ItŶ|%w\KWD % SWlb]=SB[öВ3攙ce;[2!ih:654p%m錖ѦtFiGӯ5b#n&~ˤz~Y#vIÛA8ƆFT8 ?A|đNc$x('~:ڧ^ x؏:_*}$”ᓼثb胿3o*#2>Q2fC9XUׁ܈|W?Y% :\~7G&(AsebI(] o7+Bp:ܴhA6h$'n~gIƶO* EsxHfN)bxЄ~X'WmN>4^10 f1cr9+:j{Շ8Dp Y4 cՔEĕT,V⣷h3h(LC\$.CWIZɎ?GbHvJbX!u2p%'/NgWIJZzp8!=vT*IѱH)o•PS+Qt2peT*I{pR>\IE&'W 0jO*z*pR$eWru(\AJJv*p TeWҜ1A%'WIZ~`rGK WWtϮ]8$O Ww?qݤOt6靤v;m] 9D3P:Po-' 4I\q20=~)5-L?C+Oo @\&-eWIJ.[zp\ $.WIZBpJi]Cvӥa?36<%,~O߾]I#VBGWlUseI]D~xn#zOUe2RLd-~l2 ¤Pz+@i?:}?7ۢšvS OgѨN?-Qfˆ8E"HE.u^{E aO_AiFog猧z~F(h~<í}ݻ ~͋%Zw}ho^|gܒyb[MQDzUXJݴ[j4&I N$izYpRd%%ϐp̫Uu+n } kbf2~X|縋E^;跩LYdŗXPF1qF]ᙦe^#M,s^I^ ВχūuU\ rK!-=Ynx;4Ro^|NoGoR.gÓr6?Dh%|8yEg6,\W:]i(aǣp_f\e^F)ks,׶ǍLe2@3`Yk7EF :jI?z>*WG)–P!RAa2[)WA ) NoB/ifӢ F'0\Fn].҃pquIM)pZ`Rʃ!牉^jF S"K>(>4XZw6?KLIx⧳8G/;R`v~IeW2=x 2$ IwLAmk-x2TDZ>ڝEQ-k9L,36BQ `z)(hPB`p{! ~ǘJWtdn0ˁ `3$6,T^G*#R"%10/%s2JaQiƝV,W@FdG!]cP`R;X+u72bDE"<' 02g IN26? [mi=1"S Td. 
NH=4 .p A*Β9M =I^,ݰ ´+w?Lwn(2ON, L !LQP4<vpͧ0o%P0 *:mq`yH ˥6|_C0Qݛ+vtW#6?$|0ت8[ Jw~}y9s7ׯF n1Ag5kEO+w' 6 ~7Ϣ7RƟ(>-KzWqCO8oi_Ъ}>ZglbmH"I1n(wRRTkؙۖ9'k`>h6,0/+qX |%Y;?^:1@Ⱥx2RxoyZ8㢎P¡Q5k `8jcm2ԴEFu0TđhhL' ho7¿կ֖ivY)GPe|#ۑ E"B({('XjcY56g_C3(#~:# 43rq@9؋cRwqKoڟ^w`MyeR.,:';+&?|%IsϷ) uuGY G`n[l~5XW`C s)@+Dhr-Rao< v@E,&W׵80 4Qi8J̴Q*c e2+gI1h@&Nw9#hk@̞!{kl7vT^W)h M3lS-"->h"&,̉Ѫ(4"QH=`(O Yī4a#A2IZ{1[,l<1qv󤛀tDy3HvA Yy,}kՆi)__wհ,IkO.`[^`[XA4pZ2tK99ؑ ZcGq`U-K:%Gn ݧ\X bnlhh&&VQt0] Ø8P=tHbaJk)]#J|0M^eH~}{*{Ta^m(" ;SH0'P8h tu}7~>k<>I?mے kZ"彜RT~NZ0dP֗6}Ոgkn72 Fyv Qo3*$) H sL(QmGs:2,"1rbrX+"mh$ ;g#:[Cc y8,l/T\8,S{x#^u]F .mŖ@o~Ҹ bdՒބ0P#8dZS)al-lgK?~`Q0a4hE{I`KJWEcЄK*0BXYL15eDDLh#2&"0lL=P'ָl<}J'ܽ5v^],iojc33^isޣ~j;(%{" 9;L[ 1*4*kc-62]5 hHQ1 50Cܘ8GFh4u.Nv|Ʀ(,-vEkY>e   z:''q9?CSACSAJCS㡩̿= 0徃r,!L/^&NPV0#(6c53m|#Uҭ+W'O`N!=Auw Q Y} &GVh1ܞl[ C?& 33v %7"JQXfXV8dn)!>֞N{:2Ƥ^z&PDs4F+RiaR6P(8kY9n'M9#m`='ƹylQOdEƹ`m̱l:8Bf1|%gS4ad75pq8֣,:?nXEʕ?ݑCڠyG3Bl)"3d4C!⤏bR:&*Gu1>}['(c;" !)9I=ᒀD&E(q1y N#Mw%ɢT>HvU5*GK_{8<2R}0C:`NfJ#h='eMJʐaBGA:xL0|+Њl0JeLYQ1>PT" 9 uP! 
hrIWtʩ@ֻ M9Eţu C3qv(NRvy]qɖ|IX5[FDrfZJ"[#JHEqC]H 1K/SZ˅Ii1- j m^FQkfw H]FR  &M+0 U&c pA{]H c!Z( gBL.ْ`66{ _$#z\mQN$ꎽG_#2s (z(f$bOzw(_.R+e3c]KܷyKhwjE:s^FtFjٺޡu61oko9 ~{$ܩşXw>Z9ԮUjڽ~jf@RĽEw\ݣPهTdqr3$ЎvU7ZAϽC(iw֧bxȹN9v aǴnS+/DeֿrKэ|eA25/꿺%pՍI7*AU,:n5v!~1&v^:JcyǘM;m *D ?-k컬~sW=R]k[ziTڌ:24RT()Y`U\brcy+ʹ}"{q|Ԇv9Aj/Jϔa76uows Aԝ%Y e Áŏv+Fv+nqDQ>LUQiW2թ 3i(}f,eacD'evʶ-Kl(2I{qIrVdt5Asٺv+q|Y%{0trKʊ~(&Uzӻe$wd.e-+ZN]$dcIFKl#N*{d aM:$btB2_2Tp E zSrg;$n84gǘ/MISF)){:'M ;!tT6dA*i/Maco~enCPfVg'cC(SW$lC"$,XS/JhKʽItlL:#.hA=:y>R"']fl,c/s2؈tlc]_}IN$tpoڪw,Mˁϴsj]7>.-4X ̅Pe:+0{Rῲ4e|H7ToSOV6(3 6z%J5A'8"'or1D2]Ik06{jAE[fodyOy4Db&k,QR|:$٠g'G6ʻ޾e:~ 7\|i0^{-o8aO+5 6"bZ:6t\A _NMMnGFFQJљHʁJJH.Eu'*TaRI28c(moVOtm8#JBL6AєΠ?5e`SEIf (.fd+E,266VflWGw 6Kfq187'cqMmvQ_eE9oW1aaoyv.6))NjCBFS'()³;0)2jkJb ’I$0h% |N&'-R g3[]R/H&,=qEʇj,{NXexyώ/uA珏-Ͼrf M[La2RvDT(I'ڲʲï=X@AMƈ1hL\2l9\D41b7g;bIB1ǂFXP'Ԟ,ػF|1.r "yۤR0A`aȫHUςMc&jw[f,Id̖DfQeX2بpd>BIDfܬ`g<i8:G#%&P4 -4dPȠYU(aC?O;Q E&rŃK֭xo2$ɷf (Z#*#9iUZЭ٥4q_Am\7;?:oFdo9}Uɤu^?߽Ze{U7/9/*_V]^0qA=?=LZ/Yo|ڈW| !m8|j!%_6%܅7ydž5]>ٺ8XwlMٰ~a`sMGmVroq @=_D°7ko&XFzzƣqy~3./fks?^]W,0{1pUEx)pU?\U) ^:\]\TWX>#\O\p Wtf\p+j߮Z]TW U^_{Y_99x; n: p&-B]?ե&|I Q oJI/Y`tWK*-t"L[)%L,n-I O˰џ@'5e/̳'y0h:'Q\^d9︁OA|YևYg3&ҨGW>@ۭuT*!zB_lߌz*1X +SJӝt5?/)VB~y9?Λ^gCkR@,u>xАEiȶ{𥒱ƵwYb0  +@աALW8ˏLCE'dHW3tD!좵);@OcTM}/T}1j(|)J؛g4UJ'F 2E(T WU`WU\m^ \UiXJvoP}Ŷ^zA} 5崺Y]bor|_|{ugzTW7E/>x*D!,Wq 4qҞ}lAu)Da&WAFhlj6f]tQtEtuC A25@:oSVJl6HAX['h[WIP|T$ Q08WF*XSeZ `06.L 5S VgpPSЂ'a}Op *{FT !O18Ub8<g Iي5B̝stFӓ99RSdQ,( ʮxmi @%M2YK5!5*-6 "i漐UYgql&Ύ@t1+1>y~s8[I^&7t1upu͕ qOj Cb "hAJdQyğ5e %a)TT Ln :b] %l^@Q+JUhI85V;+|~V _{5ǻ*)Vyy,YH3-eGb_ O]c+M<#e 7cY òe3ߘXa؋b)E`k ƁVR tFIQO(tXa,K~qurRɔSr LDjV8; 4(TIS\?4o'QHkx~f[yҹhrV6)IMm-jSAG%6uib/L[tbx|u oL[2bFg0Th,+c$I/YlIy~нh{03D"$%)YJeUmIdQ_Dơ h#EFrrQ Xn)&x4t̺ƬMU E}AMm.KBDΈgIʃ&RvYtvSMM!'S2?u>IN ʹI5#He h$}@IF<"P5&`T1NʘrS!nrq68?gm>ɕo{?i?vAS\8]|krZkJ@l';xTOs7o{waث8pD!f'BOYRv$|&o6~" 2H#䊜#DcLܞL(/OճO-IwUƁDDyDP5n*_[9pgUtkȆ,?N?ޕ-H٢|mRe(|is.N͜«rj5bl J`>T͚gY~=|zzQ}~m&Z,siӠ~1[[fkb G_XA~ iun.뺡nV7GdR> 
U̧h8/dg?59R$z+}^'ٸ<;}~ˀIR/S8hf_w*=pW_?ٻπxwx W`= 'CP~t_w= diko5ӼI׺YW _MDSw}r2#d >~0|ߟ\v^-Ss8+<{a6"HasTe(Ed!GAyR(ա)Sީ{H׽H)Fa(Jxp,53xwU@pQm'Ɔ+JzL@xwTܖӑZQT^-Oւ ߚFꍆ)$EAw5MgiNgWR =q`8`{;ۑgcg vmG sb< R<*y (uU&x@ʴ6?CkBR 1|Zp;d悸0Vs" -kڅmql崂<9>xfhณU'+rQ偲AmFt,HٟVVE@i  XHE)1AX'-*UiײZp0Hъ>M@ K%"5_q:Pe{q(۝mW;;e9-8}<Ȇ^Ϥ<rHv}5$Mjk[!^^,U@jV^z 5-p#X FZ|cޣ:w/W׮9ukmi F.Y>*!w ;>,Hd$t 6ZͰX3ܭ B՟,԰ .8᪰@xFp-YN v*~,JFǿ.DONto~7M1_YpiHE&%A*긲ƺ@cJ J"qZy bT1'WԸR-T2@אt?G=F(u ~_%8|@SgP6 /BD:hgkm ^ ( 3A*+]Gqsn,)B +8PXlaKNt A]cdoPFc"Ql'2ǐi,@Q)20](&cp :g4d*Klr9B<-ZNd3-TNǷ%BI2!8oA+G[Nuy&i,`f)if(Y{HWpsңiIӭxn.i wm2c%&~֓SxVOl]`ru1k<.p͞}{5KزE[ݵw(/'^7r~E~͟]ιɞ S,R˘Wj,=?ʬwdZlMfNvl5׉j4=Et(K2hnn\^P2 {%ʗ%VcOGZKjӽnJlsPb?ΰJ ww /i 7JsnQ&Fh8b%}d"£(8@'aUCjWj*$^JdQeYT1g`8op_qg٩'~yomR?ӋOW]++v JgҤ?T 'GOp(jJ!AH![؛ '\ 2L $&p|ͥA jkUz%\w0Fpb#{i0Ko\c?nl{;G MlY;, -'9XĢUШ-HPIT"n:N"gB(%qo3jc"0geP0$y4 A:N9"NI4*cӄFC. rTTZo",Hm#DJ6D=`;_١A1uu^fc PQ #b jM&@Yg^:QW(&C1S J) JnTӆ11.ɣ$mTe/T#q0ȏ~ӼVJhme+cPg|&s ;GRN~Z܌Z0$8 M.xpB F.CjS.| OV6P-9IF!"H r~)-j?Ψª>p+0UF]/qFPB-ўn!72ouY6| Upِ\ N(!ϹJ92+ iKUFULTPHd|DwmYDQ&pORDgXFcqY\['4a۫z*/*yRyTqk<:@0l0X8JZ4'aRQj#ZX<tH (Bd 6* YmB!նdlm8 >[*(3H:Y 'H >rhAF z<:a"`<DKD%bk1%WHNtJQĒ4:PSSrlN%V!!Ɓ3uS/( gNa5:M0I뺾Yuf 0b5;$[ggrQv,EY.HHt#/g6V)S!Eye,HirqrPagP䡼<܁{Fk8OE\~|$Gg{?[ l 0I=6ΔHG0A2В,qg qqG;w=WF(ʠ@Xa+`9S 8/FQrM$Q:=V((C1EĖPMn0 ^5:ۄ&3M<0R{n t]}#L6O+[^o2x*4yf?KU:R2i*Q4MM:BWj8 Dw^_#y^êK?zXmg/IzVK}k3jVG 1ZJޅoTgFfG >< ]uh kNiCߦYFx7|{lJENipgѭh2? 
f5{[4vz<{ /h.^MjQAvgnF:9.`6_yUӵb WzK9s)g.̥3rf\ʙK9̥y^3rR\ʙK9s)g.̥-^()ZW'; ጼ-Fp8*%Iʼp:<1PʹQᱨoEUzH*g0cSXNQ3xR(4t;!ȱy!uu &x 0Q8*9@"HuͰNIJJ{A8|ptLf8fq6j52.|wZ=L4_s;KL?Nc#>:i5%' 8MgNR1o?).HԎR KH.uEm{_}dRHv( !L9 hSwl*\j8u9}7 N[lBP^˥´ TW~:8O3( j*Ly44>@}S'×bdj!Rl4Z .,G%&X`TLձ*9vFHÇk'.|jO&74M]4ɓi ~MT6o^/^\O.^.;*\ϽA..zѷKƑb0|%Ly!R\-Kj+CڋY,o`0 z0bŇYGW2{WɺQZ*AK5Զb o^jaFRñKY_W"|ac_3hMMfAp%L?8:ߧo^^9&^yv?ALMḵ UAP֤YTmV]zUwrW,+ho|˭o?Wu'ek6g?!\Gڰ̯(sp,ZZ+,b+_Eަj.GA}:6 wG7WH`)CbI_q-*E֎J,mysb 6Z#0yez8$L 8PaQ5KA&0AoT .Xeփi5&L ѡ8w6'pp ey*<ێR>PaݞS zTK!ro0v}@;515xh³#jjߛc3MoL{mr0 >}H_OzhEe?Pg01rԝ8SaۀSu6)B8~p mҸPA-G#>~S;_q<*|lmI R҈0\U+j_6``-_E%@s4|wAf ꃒ+[ڕ_Lb~Tt7(U2xi23Ńd9*#GY|\4odBiLus֎DZv(G;CGMT~)nz4T(h㺼E(yu0@%.-'beP%=%_mPosQzzPo^pZubm7sϲ;[ݗoM~xh&&VQpØ8P=tHbaJ[!:8STu2a:]ba}k)oC\22 K{'|\槡k]=hfl~ ? ZNpRێoϸ; $f}2 ֬b^LP}O DŽdPq*T3/oi~B6KaTݶ3{$G(CjHQ?^g:qIbN,pR*CyIHhˡ"Pw^=+$zIJkC!nH`DI}d0gT *H-R*IPsL(QmGs :2,"1rbrX+ mh:xgl ]ސЇd |d[ct8o+D\:y zAvt|dH!vܸKbbqܛp5RwhpJ&)ǡԚJX㖞g~ DP/(`؃}*wҠ q$r%+W@&X`T9FF"`"TSSFDD yI0|bҘtlydl+G/|}2>N+quc7Ύꡛ'Ηzr٤y{G>|E 1bOAT GbBx0S%S 1*4*kc-62]5 hX9b0jNu欳l^wg$]{+kb>ܭS4A Y>fk)AuNf$~J[g!ɋ$y0ւb_ϒFvOQXEpKIueF'2:KT:0JFD\5̏R-)-e>*qVj|oDUjֆoa,{]k9lfkzY-[Z6 >U2v%?v+k.g.A`5o4t+0aR~tA=O`RGTRJm@)$aLfr޼9z)qMv*7Ζi |9|*67gnoHJ!Pa'C}AHdNj;5|3!*]ֻ?&iCJb6d@#@$JQ-Dv,Y#|FHEz@Mz@Fz@eJ1Id^z)xuhER*#6&˾:JgpJڤ /+ a_ga[X*=S(tsx8F*6)6炁lc4e$"_jY"M = ӵwАlmwExlR="BIW}x}(?"Q%ȒEJK/F\Wu7tmAc ERDTRDiU,"wkT!لUX-zg՘ILe4zlMB iبYup{k0bKlAf Z}3fvw7o/$\C|y&(9f7d=݅z8y%ư.ۺkV%r:$L92 :^UP bxVOH]B)u٤^=tڐ5g͟%6\ؗizQ|nwP_[ܲnH9o՝7jT= 8_S0\ݎc,@)Nw8㦛i\*A%  GEu ;Iyvm xCһ}%A3ʔr!H* RC )H0^S2 :$MEd!**".j|L^ 0$*#ȅs>s5dp HT҄>q8CE*H\AD p43PřəsT9* Y@/]2Vl8#*Gλ=JђFL?fqM2v-G ls'&G1ʥ igOT+c荐`*x!P hp##I&)*S3=y^ W9^kHF"U$xL2'DSynY+EY50r8=1r<H@JS@wGsJA[-cTuBL!:XELH- KiSnwM9;(a8dch۠v2J{vnY^*y'>2ni28euŔ8,EB7:GAPdPa|w*\@xR(|;88f2 If^KN tn<X9߰YzCaRQW+zY}ot0r1ݸF&]j\;' ۅGUϘfQ4R+=H"*+#U+n\wyK:a\r~}(]WrSYhH`_[}mS;}ČĈxn&Vm ][Bזе%tm ][B#g0g?PYcb{:=.C?']1$;g\'$}vܟ\rH9H܏#mM;?D3WTRQT5Sr;p|uQF ࡹ]*޹&$4F 
h̊yEu^FAK<7dL5`Ej18gϳg=0VL*-M/.+R`U4oZg]2THm%Z[ɱk%W*1R>8L:8QcPD-s8:mQ)2ZaB| Q &&ruPHY"!(O΅[4rLΧ(6_@*캵[M=]7~zq=y/@(uK-kPϊEf6%ev~/zzqɗ>ҭL+8|7NY0gkAߔn w|h>e=ɷ' [H^:s>̦ G}z?Biߙ-_K f[XlS/N,.3zeKmі%_!;˞[͟7W |r4\Aiw-)mBWء\_r_J̽e*X3ÃjX@`Kx5J{Yܟez{$2qB∕\Dd`"QFOuQ8ӄv?HGRveCcʹbGvQ^;y CeB\9&tHBTUD8"d]ո A3K8"њ{. NIJ1.tgukVGƊ/v?.QrieX-G@Q]u%ҧDs+c@[ufEMu. OM%+c"&βvL:ٗXnnylm/m=oBl09ʼn#Ȅޅ>S ٟT`R6&笓Ri,3y{%]ti+N|S/ `{#;V:5mٞgc9jiNϙ*wNhZE`J@EĠ@9(Bd05'<-@A8C#;,I8 ҉"ņ}ƶgmGa~t)5e3ϊ(*oak\0{ **/@$`-_`)8puG&EFU 8"C!D*$X1$R*&Xl8ۑR.f싅p$G Ts\^ex`3;8z/;gg?N/q\&Y-JjB4DIp,/S+d0̉,hcΞ 8g6#'!(fJm-'JFbَnz\cAbq, I8NsEi#QrJ{/N}8 84UЎAg7 z2^Q!քHdi/pD IуjOxXl8uǾ( #Xq %9\{or @2ɡGn*6KP՝NH!-+FSBNME׹8p'@q(x<Fb)vE%1:iɾ 㢬XqqSL"PN0?si2  *T} -+.iDZa[]:ʚv |/$D9sPųۥs޵.4k:lB/rAwD%&hy+l^0]E۲S'W5c{kiЭghU h5鬭q*z}/i%;7QovUGqO2]?lKD)/SݟثY|p9|_-.?lVgyюL8q0\] ƞ_e椳|qO9|UoEۂ [7CSv$\ZlN&=ay۠ۤ~2o ,^褋|O~yҕe_ ao綋M jzz5j&@\Mzso7Kө{B |t:P-%޳q$+_vH]7~:d{NkCň"%O.i(ZI<,[fwuOUu݋W̡r>{Y&ÛIف1vp:)N-餏Κ1wRyBٜ U.-Rfʰ R!&AKRE!$+Ui5 L ͽ"E<*%MBcJ0 ̠<"!~p9)yFacp+k7cJz f@C힖%j-OSCw,&!8d HHNG˓Pij"b!J+L D@ H*5؀g23BŶ}c ụ*EMx+7^GÕ h&2.A<mq$lkJT $hG -BbNjN{u)e2}|̐zDeA;otXPޙjĨ&_C1*pFKHZ2:mK>l1M%*w4v@4vBG?Tɏ?kZmwS4I[_y/{܏9Ѝť.m0PIB%*+8"R$Lz$?p?48 Mѳ1Hcd֊"݌6x0UBaA-A=ƒvr}6+kx-/*-:!&2%U+JVJS@P2Hjھ,T \ĵ"ԤaY@XbSԣ!$E(@-hkq>a9};%fb Y))2 V&m)b"H 'ŨM&RsQ7Xܵ1zJ rsHd YY `2Kp#v N%E#hK/9M4IJ@J[["8Yq6 i|\ʝ`8< j9f gE/:{1M{fv(AT☎#/59n4%7 ,ASe_~;uHo,v2jnqIfguu=y{A* cI=)6I#0 @0~4fGHD ‘Lҹ|9LŒCªK$啂R"t#Kp:~8B8)y7˳ybvu;7v5Gԯ|5zr> {<LNh׷f,C!Vo"qIirq8=GfËi~MT=~rz:ͺ`v0}tv®v8\&>oy4|5WWByJMex26O7-h88jymsWlM6ڴWNJBZHYXctԗ^ap,I?'lˤyo^O B7CO'x{ܜ?/tR8jA ¿|K[ҴbiapM&rúWɇ W~X^5 }2G.kGqnYJ*.{q|ٯ'̧U@XdGR7}oJP7lVhS* MڳБrJ М+ȼ1Z# gQs9J='Mijӑذ`yUM=a}2"mnlp, nřKF+Kd}$M[\ڎOgC)Nܱ\v5oYmz)Vua;SZv/eq_ $>J\YgLdqW(Ȥt J;g;exYabU=dpAH%@MVmH$fl"!6[@׬0 0xe61pOI'୍(bVz$8o|mg0W32mKT49f舿_ä2y3y~K[o(x9v#IGJ[M:kb2p4oS>W$ԇ$V]U00[ym,5jtR>RRsm]"UGR̘z_k/{mDjbE{Xa빛؆B-IAHV0EDIh:gE4h()5Z(}m@Ɇ  H^=MU >J-t)+w'b 4GkG߽jizwl84lYRq7+5Qnf i[ob䝌R͢X$;>~G `vS/-jK^Po~l7|'=g U#dO^WSWxl[W}-5rjK3ɬ}ߌ~؂$yeOJ3b LѩTkK#ӉW Aqu퍵ct3 
:IE&L&iyiyBJBHu),/RTƬe,k {/mD--ZvqjCbܨⳘT}hTJ֫JG?F,Vw.SUȕϣ#xXj~oA>๫X2BrReEvg'~v,9u %Z 4f K%9nL+R(L@NnƔUn<9#T`ƹlgk6]_I7qmכ=,ea 6]Bt6f~=t[sYgIq"5H(4T t H+@]Z2FA7m,\j[,"\H:}"M&'9A GDsmW|o )F9h4MagO:q[CޛuYf,IL!<_+y2Va}{Z ZkաZ~Vtv~J쪀+Lc>АBCEv%WtVJw0<{sl2؏>}ЯtFb *DS?\X j&:ór{7uDpI)0":'U]I=^J X%C<+:eB,<s4aBE!sǪ)2΄tPG20'yJhyd" 2rֲ(X Rs\OvC{cw1)w=֋W;}*WPHl:ړ:'8;q"p+xBrYHtg+c8^YZ>Ͳ}7ڛԪkL"s8`I.qQ?btI4H\86+TQiQ;b=T<'FXCY /# i$`# t}5{(fu->%WG$f%xzn"&ӆ3d]G&'x$F@aנFRN7A0 *qR뺰(?ob޽tM^\g5N{s&ٿr~IKjsX)W˝es1sCl"B/߽{qo3+; Ǿ0M0YJU!MVڜ{HlfW謪ZcyV֓Onq_O! tY_܀v ]wc08ܵv˶s ?kXJAף돧&#TP bE-1D,+"x"t⡋Љ+BG*;:WEN`c^kT`]p.d\5niƫ ax=Bmj7>}v-ԝ fHun;DLE:vqud\t՝:copy@}Ad\e\=9)j|֯ ѕk;OE˿~.j߀V4fBSfb;@҇$ -J:N4A{|Kж8RAKuπ uɔḑݒG9x$*.Hj9HR*$,b< r‘FeFk蛜P r8R+7+ytV?t`je<,Z_d.|LyU]4>g|-Soݧ~Y#?ȭMF,M tme339"0oE%m0_FWۧNnt9f7kOFV4cn;k3>\Ya浑tlot{՞wyoG O~z+=[u܆/^v[E՘)8gܡ.u1%*CEoػmyj4uzuޗ)A %D}%1")Lf*r-äR^x0Yl EMGt})BDRGYK_]_O:p!*X1*J{L`!阣!HJ)iA?{Frl 7H]6Ǜ \XX"eRקzEsDJjY#i =ӧjI0+A%~@GA"K10^ 0X,cB$Pއ8Ar l^0܌*ەW91pz'xzR=Zt~zr9s,c8"pC4֡ dh &"F@IlDdLg[ ElBπx rǙ ZIBe1^+I3OWDZzwu#Wndvy3ܟ❜+c3;nuVꐅDYճǛEջwҶc@7gO_4=ΜUBޝYY.kWT_Yw ͉3 %&xHe.Ip!2ZBRUFy,0#Lj]XaBcJLO{R+Q9TՖ:\/OLoã_V2t¢:rG~avOqK5z"uGXD0d%FI]$3ޒx iפC,-ǴB!KHAs4Ԥhz!T1qeKȷƒaYurL9 #M*=r&".X4LD.A<)}1mD6i &P] uգu%%%$A &y^f:| 9\&.y*PVyojAc]6rY@|Ť5vNS$.96$%*:#P}e~Eګ.:4B;]lxoq 'rZťyq7Σ͗}g`JA:57S*E *a~L W~eQKaJJy*#Y?ޚwʙMxBY@3dayK8+ OA.oӭy"Ͽ9~iAYͽ3cuFF$Njc\ @dQGy%.#˫i S_,׫{8`7:9ස}4ai7[l7ҿVK[q7Ur;o| 177w&8-ϼ:0,ȠI*&7R,eU-ghy"wagGagmU:)nr'q.MX*ĒS*\9!"3L{:n}2~p}+P6'#oH*b)q"F2JAJKO#u-[Ŝq(uKA%RebreeIzRÅUH^Eo PYQ-gRJ3#+:Ͷ{!%7eV{gԡ**w3e$Gڝ!Z0.c6قL"L;q\5I倅9ZW*̸-Ͽټ/Nh׳#6!/xڛLHte%|v_J rrYz6cEZ\AKe4Fɶch36IbN!e VFjُ~:&f_PtM38Kʑl+Qy儗CPTk 2vjRQ$ !HH&&4O&HEYEȂLfGڃrakԯ& 0}QUFD5  ro+wz3"[)&,v6{LnsD+J+xUDi#6NJ YH2 g|K{i {dIsi3 v:m|wQ{g|դ侸h*pq$%D dE*DVQ-EPzZK >VF.>.IG_^nȽg[QA OQU~l.NsD s c&Fl|1.(S d9xGwT=wwT=wTwQY"&Y^9d5KmeN_,[fm$ zH#40Ys>J'WIWYZri ~y~lsn~|{XZҾP{3L9Oe٫$0TM3'Bɍtj:.VL]::]:NS?hlQ]FjFX+րz4mƊX ˒Ѥ XeX%D") \WV3"zσ<*g2"j-ɍS>&iVHN[͝ɡHtxe[m9̼ l>壝ƃ6ˈvݡ㝇ظ95POq/Kor|6'!rmf+Z@0V 1:0 
[Binary data omitted: gzip-compressed archive member `var/home/core/zuul-output/logs/kubelet.log.gz`. The compressed contents are not recoverable as text.]
aD=z: oSo4B}mWkyP#,/@K\G{^obJoXO׽sh^.>zo=#?!/N X^1fOZ3Xgjxq5ѳ1'uQ~Cnu۳JϢo=:i#u`4[h0)*6LM榖7ixzO,xW_~ëC8|;8Sbwo~x>//~^u54Z#fh~(LelôV-izهl&pӪx@˹\dR NWlfEߟlh2U9܊Tkb#Ax) s{m;~-&.TǺ>&>F;[ Z:$uQ{v.E.38FP1**m'UΆkZc+}-:ƋᷕPY%dXe3[tf4?;ݕܝ;nۭwm}_mu/ª>miv=z'uc楃<BQaHhrZ~?HߟSc*Խ.IbFX*S"kzbq jV#J+^“A$2EuN MQJn9ػ{I }%mnʏ\jӝe ͧ4ySF{]HgQ9БeKdD+p,^-4h"7G=%BP(4$*A&(S뇩{t=jXB]al~8?[\ۺ^o7We2_O_`Kl?# vYp#%mg-  xI: K^Hkt[!&\H¬3/ߏi)l[)iPS8xc|]ՙY'dj4{o${/oƓ-J#OWs fԷwи^ՕwrJmbu҅< rHvc}XܫY7wS9pQ[/ bӶn{DP MP/g}5D9NP ©Q79&`ЦQd }ɮV?AJ6!LP \@%}R@RHLEJYu0H I@Ycbu@($%#b**F+Fձ#3rnvn.H诤I]5uϣO|/{G1b%̛hSS74gh)B3euVSeg@w>w 8,ճ?/ ha z%,+sWqҋl w1iJsڢ Kiw\_o?o`q_jՋZ,+t<[`◟g*7e x5>NΦkROwú;xJ=|t^ #4m`46Xٙ@ -BjD&},BX3oejCgJ9_A̧]`*l ҄Nw8 99ߝ$l]D+B QxuʃSιa >s*{ӆ`QJ wY 7vюW XOn|ĕwz[{r}5hj iTzZIrz _s7M6jtk1bд SKqaF QڍSxeE>ʊ|@0*+2MN< -zE,89(k" y! u4%ixIv>%Y]fϯۇՃ)EzUi@;*)jxt2GՆ JSDZa@KFDML\ND\f)H3 3%=TY̋oE78Oo&i7qwɔx>^{ Í 3\AXuc?Q3R-'E7VyozS *L@OrV D'~,~SF9 '{7K8u"xFK;PY 3Ȟ,hh")Y gfiWfŦcd$p*`HW9FiQppy.K``3Io 4,ypsB&St&F AGyq s`.^g.IdRJN½rS$8n,,9NJM{4ΓAҳց|T\ 'T꼠Ӭ4J2+Pʁ[cſ^?t@j7@4j#0'Ü 4!в=$%JDu98Ͻ鴤**]eU09,N%o#Nar,{U9FU>Ut%\弰0 G(RI7z>_"3Ӈ.URE+9rλr-&kFaiL6[zX%^M)` w-R'QJ}HZ&Y@K^j#)n"jS_༷TDY:S^X7D<=ۨH[2N焙&( ρ76:J< Wk6nD4UHz̟MɏWa<8CuDb~p`/hatbq`\D2P*)[4!|g(5Bq lpJ(7ԫdR Z{xs@Ԝx_+h3Z-.jz9Z{g,cW]|}:Ȳz--\[|]\+O_m\j(k˦6!mm Qhl⮺p<4v!ƶ90ޮvռx0y ?>ufoVjݾdž^7/?xe||=gG[>}N۬y oh 7ݚ7s\v*Q7jXŒq)e4aFK۞,mF+n_+"/LS% x`G]M%|ҙKzyv$`/ڼu뜱*PogT?D,=Z-@}:K2R,i/4阣!H RaK=&fCeb1iӏF.!Pc39y1q}QYcx/TZtP!8)2я 0w7pJ ɇ[Xg܆͈\@iQ^x4\HքLV$fGZ#5NBy$H<l~y'H=32ϛc)S4)0 *zOD0HEd#j#$8 c^mqdҨ8uzǓ~4#=3G >U. @*8@`R'=&1<`0ެE {v[@u.>ۃI^q/o~go]xCՐkã@M. 3zrbflvI1mZ;BX(Xn wg.g1?]ӣ,[S+F@5Zk#HMNfcljnݪ%&?i]nP:]Mg7i^٭4([y'1DRlMMZ&7-CCIX?^..j?%}!)q?nxr6O_Ыv~?fyVy2I޸/-6V_ֶހZ즅nAijLjO'?7jC. 
s/~ނF$4rǧ[t={AtȻ#ٝc@V ]w}N~\v]lݾcHm2xdmEת1x'n ‹f1Nd[F-v5hy 瀊- GT v4831HJ"9 ҂oDBzbR_ wH=ߡ.zp8vcO*oՊR(p%t"1E42#"()JJpHzOEJ3Sw=˅窂vcOc3m 'f3گq_RćɻJa6 ʍɸ2֓_xW?^e3?+s`~MW2ݟ^V2e(͚EL#8}Vٷo( ~MT  M> C;}vw[3ȫafig7yဪakկGׂƤi}tˉZ>\ԏ;Wbk:|wRq5wأ&fMSl&VcNl b>^/ݙzavWTG2MZ3و><(RIJH+k,!&Ea7IUSuD3<&wSP/² R -=wûo_[T^Ƌ/^75s9D)c3(#n S >LRr ڽx { W[g{L-.}Z\Ua0|N9a{tK~-.rK7B;X\hqySCV<;oi,4ٯ()֮p@#kLf,=5/+PJ;"ݸo&OgSrkYˊzy ~+;󹈂rhLA's~"\AJ9?4C??&y'F%#`+Ĕ@WV#]"] B,Ib U ]!Z=x\;F: ,j7ƾ5 SPW R=|f< MA_N"O$?!| f2]6Fs;;!DN(b $ .0ռF넖Q=tFr!"K+Vr[ ]!#] ]vAtu9:0}Eth%C+D&tu8te$0B`;tBBtute!$V?"\]#W%l7CWl˩gGn`! ڳ2 ݓJIհu+6s2΅, BBWVȡT|C+CZlnK_,fiK}QՌYe"!mu?w/{3w:-.MF§ 8MկѲС3Eм`kdNHGDED *3|d^4҈q-wjfWBL*ι. [BԚվhcR%JDu98/Yy+v'+vӴZ=Pb$VVKQc2/cJ RF,FcAzF(5h80V؋ջnd#] ] ΤVCWWc/F +-脚(ӜK榀-Ь$FF 'HӇCӊO ++Q (tD b 2Z ]!Z1x QW IWF׽{)]!`ˊ+kH1evt(HWHWVH鏮$(R Bt( [N=?"P`FmjU7rOn(8x#]=wdF [%+kD)tE1.:]!JiF:D2(>rR|ˠD1d夶_LO(RODk/J;4cx~ɥe$qK[u*EJ+@uU b+ BVCWRZ ]!ZNNWRtutj%b|B 5'Q, ^}hkU M#\S̝ ء4dti @ŃNՃ!J#G:@Қ( ++ԥ5Gv(:D2kw +EVZ^V̖9+./Fъ2vJl9\N wnp/]uCdj LJtܩbkXǤSg -ڌ']>]fRǙ-NbR tZjZ޸z)ermA L \Jh t ШѼFÌRpX[Y ]\dЊ`{t*t-RDWrCWv_鈺Z^Sgp4uBɑXЭ6ho˭5Ѵr`l)4hiPR*G>@tlvؐb ZS ]Iɮ /+D97#] ]iϘ(0eHWv_5໡!ҕ$%V(w:;ջྜ嚀=Е%*.G#]ZC Q]-^bݿٻ޸rWEGY" a!x +[$ϮaK53ھq&<[݇d9{C?y?@@e<2=J&]}Y:)xCtVj=#PCyl@t}*=y",:ٵKf1Dd OqɟW'B{(m#*+pZta;ruj8銭 ^7DWQNWb@{cC)n)ҕ챖Q^Bv"G.zx[,g~+M0 M?CizhP#+?k$Dc7DWfj q+tE cIWHWx~'W,VΣ, ڵTҠ#?'k}9:'kQ~5Yy{p?D+̔G>oY av￟͏o Odw<=yayD}* (?1_- 8uQ7~E|5L+5#@==,wS-!v80Ǒ֑-~B_[FXKeMVCJQQ[S+%#Bb !;'f%*YBh*/|"OnռX\).![ӍoFS.x;MɁ1f"S]k ],)";D{j ޵6d#oG0Lc&)eoh@ZE6`5fg0ؽZؠA[T< wԡkˊ#hJV**>:_@2hCgG3cMm̭n\%/DIU`(.t-c64aG{uU?s%p Z( We]ֈ")ٰ֞)`Uh[% Z;wMBAqV*SO%\b &؎~okҥ:"RX dlc!}MhHpu ) d> 8׀T %.=8 MB%g d,ܺ޵X ܁*#7CAwVrE7(cºo%a iMNIN("2%+4v&߹{f)!n3|kVAx`1g#~ n1!6Ks0k)q$88)ԙp5%@ ePfb'!\0+B4xo;E4GRF[ WPP^g(;"J` ;lBrK]Tj"usj{UQR@}j4=ar4[6DrRE(X(g e1!߮iF8YCk,d3W_r3/w>bF\jX14p U865eH;L'%}?LG<cgrݜerl,ŏ*ׂYz.0A@6=Dfx:p2ePtd)Y!J@hW!ˈc*,OHv9A|`a3{=h5'HiD.0iym,}P![n 32]\ˊ1ZzOy L5)/(9n][ q;2pԆ>NWUS Yߚ^my'U¶MRFNE/Km ??ͼ ȓ$rS,ی7XktdDHc c.ISr yρ 
7D2Z}i̡c8#lϮKD0ePV!nKԔhB;ڱ",|tx E h QVՌ#{u-i,ƶwwOv|ӏ\O/&&^<}iusWs7mOX{~JWh㮭ov{?r'uIY&[w \PUVJp3ZGIڬ:J ;f%Ь@hVJY 4+f%Ь@hVJY 4+f%Ь@hVJY 4+f%Ь@hVJY 4+f%Ь@hVJY 4+f%Ь@hVJӭR%s3wCGH}%P9ާX c'@hVJY 4+f%Ь@hVJY 4+f%Ь@hVJY 4+f%Ь@hVJY 4+f%Ь@hVJY 4+f%Ь@hVJY 4+NyKg0f*cZ{8_U- waqM%k㢟aTHJsÇh[/R#sl۠U=uzR@w%PAiLUH+4X@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U T@U tJ C,ć".fv8J `0JvȔ"U tJ \˪JJJJJJJJJJJJJJJJJJJJJJJQ}YZ ɫ_&Դ-O/wPCJq? "퐄KpK7pPJ]K \)VDW ] BWY߸VPϧn6zSW*m>~0cf5.Lv~ۘIYʍ 6^{Ky7}'Wz]'v=e3qx}n]URl@Wվ]ϥ DWXwilE^~9NlK<԰>o[M4Pڭg뫆 :†C|Na|ۿڨrz_w֫~k& (E2KNiHG$,s^_R{'tf7bQ{z嶻^^}F6 zwWS{[L9MٙA,$YʌC`"@TE!&)8hœ9$؜kʌrU7'N7gpܨS֐S _89]F8 $oEjƁ`H9ZUǾ}zJ<%hyL "&"YLQĨCDad4OBD:1Px t- ͚BPhpŢf&$4 '/zT[vZh618%ro Po!-v2 QsOqԖ#N'8,MQѵXIkL[3r؊Ju[8Zdbj:J ~}'&m8}S=rZOkӾ'ir@Y/)a$Gy\ `l@0Jzd)ɝ2AȑϳRoc\5l.EY9Hp#ZL$53v u*qag/PV.|Q. *m[\ivʎyXG78^K{CTS&Z6d>;_Z?6 e%RC${.6#QިFȔ1Y(EN!e;f<~:[)f_X3 k5}38Kڑmk#P{Ly@;`#HjPCn>ڀ 5W!SdeA& #`(p$ Ǭ# A&pc> |&ˆǾ;fD]21J)@`UL2UWlQM12uF':eD1r㤕62dxGP% '#?LZ( E1#v ,/Λsi3+ٗc^4+/>d\-U$2XNQ4k0d'ʊrYVER E*>:‡m \ElGg2{ emjなh]ht0\kPRvuݯ2ޮȗ9pgP\;(FB$1\Pr"*=/V+=*ֽ(϶Q#,嵐,3(ergPƪl"r2`]k섍KCedM^HQ9)fhrjR42ww|}jbn\v{˕iPƅ>}qIvBfplOp;kLZh)K(o~h:TJF!+k1 tEWN} dVNu~ɨcЈ:X"'c zH k.Ș7IKqj] `7lZ:=O칦Ag5/3$]"1,FeԴIқ y4P pID:0>{aE$m&& -sPT )h.ݖ%O!P[znW\.Knd]ng};\!̭N(YLI¤*`)rIZe]5NGv]JJq.ɝU'ĬB6StΚ;[[GK-GsKԗPf"Hkm#G_EM4e3; ,,p3,iԲpݭemN)fU"P3&g:/Nr0!ܡkHkelq|ޫǓ_|KgigMe#؉q[q)bOX po0'qpa8h8xe)B^_:pWQ@.ǡr,̗ͧjTA#;sh@ٲntsTQ^ő)g$rUG'$" Ȩq+B]Z"7 Ye󺎗<]'6Д2}7Hբzm^WIE|TZᐋNVobs"lV0*ύqu:ؘSzP7Ϧ52Q~}:0B&Z,_4_k fkƑx2* {~9!-#o#q}˶aX0}6Oy)>n ]^&uTFvdۨmsF՞"ըen&tU4b62Nѩ=Sp\?P}x?ߝo?~3~wxW`= G$&_E ??t]qР>C.g,j"r˸ي[)Cpѣณ'+rDj T0 In|"Ig5m"Rj~QiμgGvݙr.)cIT@τR0 V9=!ؘ 4C'){CErY#TrKmZ\Dg:EKr&L<麊Sg[qd2OmgzqWJ:E]5˗,Uד>kkmӇ>SUmOV|'З˸:XI?K0ġ$W&\kƆT0p38W Mmn^?]z NuO;=ȩe۽Ϻn5_h4;8v#a+/r{twXYK|4@bW8XAtMbstst³tw\k93u$~Rr29UXo-W\rݖBo:/Jܙfw< qn?{BЇѝCXlRӑBqDtrk*hz7oG[M? 
+E|~mV[Z^׋|.WPe[I5)sѹWFIA5g4ɜuT҇UL$Аf"P20),CqT9*xDY9}׊^gKդmP`k2 Ϗ[x=^"NvQj:K 鮜 aj(OU5!qB@%zjpE \88}YT+DkhvjPְ,0|NFxfkNLZ>wweUҴ]w8u!@9rȕP0! TY,K1 e`rN8_ec A%Ɛҹ&DJ !Q*e 9ea2  : 2%(%I !HNxNZ IDK`̂qULsX"=c,/=O2:NIIDJ g0B=D!}L Tʍb:eUr,mC+FtD<^0'"SN^"9t:1)AsEb]aךY V%o#VD0>L0k~qI% G,.(! A*E$gEz J!UO&:[pPNK}C5'ʭ#iyNyE2窴(w-P# .N$ ŢST6F)P6451h\9C9%QBB,B<ڨ9{]T)(cPD-s8@K6R(eM0r0TeNiNH?rB%dzs:xUa܂d:C'^)+W츜s.TgWyYW q1$m!*Z_Ə!rrAWQ}-`Mx:hN.6Xa{D02A|P%߶=|c|SS:N"Y˔8*=?at5 lApAEE5ˑQxPVYw(Wս9Bd2"Ed8 |cy,gp56C!ƻZA=E-YwQZ7YVWOr![y7rlﯼ)2˕ZdKեڳ5o]D$ Vwffw׹Om<Ŀ>e=0[غn]ԭww9.tfhvf!,eݵwz^5&=ܽrOw ϞGUJ_&ʛzrk.h5{KASxYȊKK7%kxdy:V.*p0>?_ϕhgbvq"?~|d^7 x;8#! ͮ!|G=W(oI `t W*6G8P+*%9BUyd_$E*iBǸb vS?]p.;7.^v*ف-l(*5U-x mL;}ZGmtT1GjhGLR3o+EϺ%ϛ֧J-"Ǹ %2C`1Gji^J+:qAx9uZ }D=u .<1'#ԅ 4>h܃:x˪!( ;M6p T x@A捦N("zOtLz y9ZP#qx\=(%?ַ?Vq&+'$"9Bv;+r!t\BK D"BkMhP թxJR"0ˈJ \&>("J/sO_Y}1س2D%}D ۅU'\bCE6 s$"섪e;|\RR:_P VS  +$q2*ԩL齺RA]AuRW`8*k̩+2|ޢ/K Ng-dMYza@O]I%qn9{;{rEU"XdXǻ?[I|NZN #tk% ɠ\HA8E#NX* IBRx =꽼4?+f(m `m_HFɥJ Bd z|1hR>7Kn˛Y6[*STt1Qw= Ywm5Dd=c5! BK*l(0λna;Pꝑx#IGI %XZ䈧Z3h( I$NrK@OS#e4qap,8eT!2ir)M [żצ=2J \yfJ.|3?؁7WV\n-b OF THϵJPZ(e[|R$iBIjr[Bh )h0p D)ĥ ^@b . 
:ԒZ(A;eQR3$P9;!=M+~LhcѾ .Ǖ$44nq5~b{manߋӷӻnV:e`rupi]w޺ɒMOU֎C/ nmBFfW(uA5٭nw@q_?Ƕ]xj.,~坳Mֹ]v>yqݜF^r3?L67?C6G[fB7_2p;d>: hp7dܳݑ5/ ?f0^R8i;RMKxP(=w(.۳\>Y\̠)[7n'~ҕN~g_xJ'*T*Gv"RC)(4l/vLBH@ed!jtF8FȼA ( %Zxf Qȥs>eU֧pJX-cduBJܡv("4rRZ'ZX2Qp6d.v)>Y^*ij=Oyg\?AƟJ.=FrKe$_8 U#1"_ #6^:Ge֣y16 խ=exevkRv#q k4r!C09Dx}5yvuJV{k )vlqwb笾ccw /Όbj,ZӬFf17"i'9g6;Fl1B}V kjOj54}j~Y>O_ݔbI;A#s|/ aOW8\x- &In+uԃƋ%~DYɁtm+jb\ ~0&쎫iY&-~(;nG|'N`ƒ;(?#GPspa hVSy/tb)xIiT9%AAD0cYzs Wz^_^Nͩo@9G0gT2֝S}NC"2E4AaNE%LCpӗjo&}v^7v *Q|vau #74j\S<ȊH77s(B[aadu>T*JT2qlp@ bH> "CUT>z1"gTK*H4iDhfxfNychkㄡ>PBDRO9R m #g/of6T Fwށ$= 5Ax@quj#8,H(IL _ma #II,E76\MX4+0eMxP`#-H x%\vYx4Ws,H|$VjpBRstTzd-:1$UErU3F5A$c)l&0Hd%FG\m#$JFDTQ^ c=i9{ղ+~AC8H@#8kड़=ZN$U@T"Z^9l(a:7w22 i M>7zV(OVv`IFoіHZZX~Kf1i>pi(衤LtUԉlA)O\<^9YKo?lPisGQ Ry*s(\0Jk2T٨k^Fx[5.#HG%/Gy֞k9Z7mŁbFcZf'G3N_5mve g޵q$e_vH~ 0Y䂵lHW=ç$J8(" __WUWW.G^eV7v"aSүl8!3(n'.ؾj;i=%f5[W;``JP)g,@'"V2Aךщ  q$#ǿhdNŸ@)J'ShDmL'j͵ZЊLeomAhlJ|kw3IeLYڮ} n՛*lh]+'ED85 4lp,"x80QdZ-,.:$NBOkB2N AR-lBjg0t(aBh %LSI6'Y3^KS߬vInDIJ{w؈8.gY-JhjW "Q%GTK  sY]kzg ,&b$LP m;Mb`"c"^1qnGl7%-h jCڝ{ӈO8It3/,m!N9%,w׊c]z A4jT;-(!= &D%L{#b^.$G 2 Ĺ+~7-/"Qt!:S;gs*ւp9$TF,uB iYh4%D*@8 $( XDqy QFIqƨT}mwT?ĕ\E^j Az2Aew|xGxGh]hQhE B+AEQEC$*:ә@H2ImPyؐ7YN;,3o@Ou42QǸָDK VH ـQ65sfrv>ٖ곪.߾$ i^2?M[ÄQ:?ݝ@6kpG N_}"}0\8bN@ _f12H OĜZΜZ)EɼcN^G^h-LgIJ2Lr'(iBOy 969vyCwk5ݲsΝ8/&s%UF0S)A"\4H.r)" `ؚrl_&4)%x4SUR1n3dDgy;{K+|cQӍX. $D:m>HBęd$ Đ qvI3N6!) R)4j'i(j'4RKE5N$A&R^ yT+.M4ģ*b@)$g\7™oS>scv8O5aS럹?``|oֺp^' 8̜&/o;ؓqSZ^?NC?- THgا  ?'QE};3? Ǿ)fגD 87ZOHD-/D׊)W|.nǕt88d)mІ -NoŹ@z궗OU(txO/QxL}SXWf =1.p8У@}(V4}Ϯ5X^ySbr2p]#&Z̅9_{}<ۖE.9GpŤwo޸8x )iy^. %+//fyI`X`nżo/&7Uӵ2rsJ+u]]A蛗:^8#`Ñϯb t~/U䆗J/ ۅpWo^o<1e/Go+Ο&㢰VJW͋6Y\&E&k=mD4Y5^>8`Qʘ[M ?^|q^u2Y<\7ZfE}i~VƅꯗJ/? 
h!o@8^?6^*zy.1}d Ѫ6q'RrWSqZscJyʶ AMA*m'ΆW3@h9Ԋ"<[K ڍ)$Z8k -{l}m}dAZTzhUv&lG̃<-/Bj]7$5]|b'j]Qm$"*NY|%=zGy'+)h1P|0ȘNҀI:iQ n2˵\x!&ji.*@zgLDq9%{Qr/a|-JY8viG)L;eb<Ǚr4%˗g0K-*4)?S-I첌U%ĔN8S&&)oTԝ>3rF\_zbP-^pXځaY<{,([dVi؄f% TR54rgs:%* *7F+Z%+>u^yD~}wE\By_N⠘sykBf(V\a~#d]}t02Xɲgxu]2.:TL$&POn*4&!ʿLv23Y\ֵ]D\,נ} qUeoK-׵jL׼G7sc&ph`h%Lh2x\YRDʵdT,I4:"9P*FVG#$QbNR /Iճy<'i_&Jm~]9žF5elaY^Lz#r(c;nurGuvMa3%lU;m0?x0 -˙;p#5s)SbUEf0yQO{ijiL ߩ'>ϯ^Yu5(t Bh#Xu)C<)g6`Wp)%+pV|O)s;a7+r\ebc;<(u9ƏULCOǖ{fxP% \@@)Ult e<}ocq{[Tҥʲm+ |i/fJ^4x şO?,G#̸>]jS~K1/XA8I#jY)pHI^FnJCnm[~_y#ȻHj_\.{޻0ԯz/^ىpmKnKNl9Υ4Q:r9ʍ y<-6~{9$Qm FS'сm3~Ro4Bli{zbR~*୊hd'y>I aJZrrᳰ_^}m5}uϛ̗g^6[ =d? (Yٰ҂l򟥔سT$BtЎh/;ˎh/ãXZ0^vDGG^veG{^v**m$G~們) |n6@69E0i1HE>DRe>XgX]SﮇRAG3?;3?>r]\G#u}>rgygi=9^%mji\?22[@%JږVLGb,vr^¥o띱h!p1Hcd֊Rkjb8}i$a̫$ EFVF|@I\5g 8g.}OJ6L*&.&u-e5tvtE o)I`'mmHMr(z|4yRkYH,@eb@t$$(%5Q(xh,z҅^Dx0қz8NkG3M6;z?uL\ȃ 47> "#q!x vL4 gnK[ͮŒDU9X.0cVy1jڈԸ@7ӾYϼp2Ug8{7"kZ ]\OoΗ?%y!ЖC]Gg ć7qsGzG`FMeofwY~^nr5aæ jdݯa?]M.GzI]o4`ae\_I+]7-Û,bژZ>],Z|47zf?pr}C6`I-V~HYX8˽Ʃ7ʽ"6 zi_<h^\9ʿ}ǟw?|||??~iLR(+(g˳0AzK[Ҵ|i4bMO=5L 뮳xxYʭݯqG7>(-Z<~1^&6^u 35V?c<_1@r>^185DCfѪ5=e߹SAsdycJZOƶdE (%6IvH/lN)0/.@g:hN΋9"[S$D$ѢURq `} izKݙN+g:4y&6Vgh&V l(6eOr\|'0=gKKy+Rk,*A?^$ǃI*ΘJ&+R4AVfIrtBy_^ǐYcCd #ɦ$JVfHlb OZO[3c!JqyNZH-݊\w[эV>r p~C lE'vƯ%v?ٙe%IYGcYj?XaQ+%D7-Hw dүťʐU .f]#r%/,t5tf5;Yu3X)sy#/f t*]4~3q]Yju7]_w Ǿ4-&O3R>Ik\JhYΪʺ\ ;jJHo->[hÅm^m߮13ܞ\%ErXXvkYZ['L73QV&lFDDpE2)_IQ HJ 0jD ?2_yy2U`^#|G/// cΕ#"|Wkm6'v9f+4[a] }]1iWc̛} koI~_[ڵY v+yjX^*kegm03zaxr3叧7B~! oiڷ)8YTFc3T?eL$<Lӆ`,.WjR:%1-Sw~eM'}WW\ʖ Wz+I{B1G%C62G%^\ IoIHB32>% :+ KZb1>XO P^ (XHœ2b[Ɉ; YwhK1ey_dɇZ|aEv2|0l҂thR.7ޚKCTgd oEM9#c-F5,$GJv3qwgYt盓!:&RgTFOifH Ơ$G ̺jZ4GC1Bj/mϮFn,-Km x>7.yQ g@N iACW!kyLw9;1?-.Cb/~T~rE Uݳ3/*"ya ow:x?0O'a콋e: }_ƯV=vd CokPM#T:jeC֙ ٤7h׻Wv'Mf/o7PM!J։CIMgWd.7b'$3+c6}'"a@csoy <Õ^Z取W7-dffnmr~Y Ak]M[d.3I hQ`ڭs8qJ2՜Y8iNҏu]s.H^Q rE n+mט N)ClCNDb'HA?ܯ#`mɗl;X=v ]N u>Rnm e k9bB nڴaMpn"ݕg!(5n'qiѵ|Q\Z}in|(xߟyȬqߏ':~oFivOu7쯍j(E1`4:VMCۆ'`s'mZxd°_p>afw=Mދ'=q7A,xJ&f\(Qԁ߲2ξ Rj|? 
Z0謋MmQ#)&"\JL zrt޲̌B],`CKiLsI,=k7AR i2Ap^@򐵒PҚ3 Vh3Z1J"moFk2_1Ӣ590ӂAl^xmȭ uFLk30w 3-{!3-'/QB UW j)` w2ײ86QOi) ͨO6@ƒtON^nǯ76bAv&6Ϥ↟ƥk3mz?tuM5m1mwӦ~ceOU_n27-vTJdJsU} M֬Y<${TLĥkAFJ`'BDR6Ke"rkyɷ[,Sr))IoDH3 2yJ{\kљKos#BHQk*m?{WƑd 4YG1;XZZEF$@o߬jD۴,몬W/D&XJ 81Za* /GŔgB.QۻޯwKp71㤓gq9ލG㑟OPy}?'2ZजwW#qb>Z'z&B@ge+tj ?l߽rq|כ5LhS1!jJSPvvR$CsbgZ>?;Y,܅ Bk9b`(J>kh0S(KEw ϳT;GeNMD/uLe鈞Nk;9|TOF1;=bu>SklydWӋ9 +q-*hѩ Gq'Iݲ- M1w 6 26lQp?qrbZƨ`Zۚ+6mǚwe:"k|F#Elyy#!ow2fIӏnxvIdȡ+Oii^tSAC/l%cs9Q*0m KP媏oO,Z4+\z2-iM4| vk; Կf%h$9.$gi '? $%^22^ _㩎F&DphiJ#AT[mpɡ[3H; ~o?.VCGSX(-San?v2 fz*b:DzU2#q.// /ּ^}n`t,K؝)]o^+A"kRTyϥILU$&B' IlUB))@!%hx xbN *Õq`DP')/ ԅ10Z f87eB޲'~5{%A@Ǚ^{ىGaQ;&HP81D|n:bUt#A#%`h0&,BdtAH˵SA3iъ 42% 6 9SXz-V "}$V(bCT*#m;ABZVo M 785I*ʃtg\p^CQR{8ѓfU3Ã*X@`Kq@v႟םJVG>U~_g9g, ]Hj5pQ pHI^F9G4D&,%=XrvH{?ed6oRZ Sfw(OdϫZmyrXgC?mc0/ޞI_W.Ol:ZHhIkpF/ ( $SÜy)k}m'q࿽kHho\^zazҽ0{zR~c+~)tF+PJ, f_ UKJ~"}Y1<&r'qw]g.-ʟ!O3L"j g8Ε`UĈ{7/d8u`ovq֊U 6#iPUȎ$-AJ{؎MDmH*QtPm̮l,s(#XRy׹ūx2vQEoOzAXByc;+-@+5v@FF#~ifů'19F9r,K`>3V%|歸Z#%4';z$HѦGwˇpnh5mNTFɻLJ[v=?'rc*->t}" J2Z)Egrzl+ӕj.mv|J-zxgDӤ`<9)uLz#fÈ6fV D(W+"fގfC0|KƞX8?p 0O#vCwNN{w2u{Ͷlٻv0v)gV /=-0fρJS!5$RHsWJs`pAwTƳ uo_QAv pՔ%ꭅ&Z={@+E$jwh85[ͷVZH)([/E'"9[Ux ջ̲W;|ʔ"Q.V6^gZu8C&yТ8xX9rw!+OۤINjZޤAXC! :^qH41{'SMgIk ̟VP3͟'8rUv7wCkt( 42`#ZCW֬Nd NWOVn6)n̋+dpIȹ6"\EƍO pj:Mg3Zx'84")ҴFB`n ]e~Qd|t T29:nhA42JdNP\&JUkZŀh)o3QJճ+زbsz\ij7pV4t{B~ý}t.ԝk˅WBO.=~,CN=X/C}!(`LkVh4 _BN~c=mh,\s崤Zw,x7o3Ou6?ȝ[iL]崧2ʣj2{L@onSP%A<U-Coتbn׺b|oK.8š.`RAbuM3zwE'uo+%=1%MЭԊZߛrrVi.Բ2͐BXtu[ֹ㷍Tդ=§(}ZX~*-"ϙ,ϐWl>2t5{m%{q"Ƙ.qSYs: JRttuP:[?O6-.OW NHo{؃ c>ɸ|{kc$1N޽J:~/gC9t'Q[+W}Do\.$!B"(\H,e9^P'zA]D<A_lQ4n`G G=™|hؒ1M(/.Ӑw38(m.g~Nb.CDW5hJs4 Z|C)s^!ҕۍ]޻ww(%?ơtute8w ,t :]9*Y"]YI+Fϰٶ-rp"kVʡ"WCW2PB@O(Z{*[D5:ejDe]LW몞 af0$+W(,tj:]9g:D2u.jM˛mإK6E]u]?/0{Nts! YZ.T#M!5Lɲbz}.bB2kkNѬ[šWfMJCT^?p #+W,thӕC5[#!+RUEWwg?qhw nNWtuteLK; G0'+p8!h+}qh5K!,ҴRR`#eWϼJa3] ]i]`FxW.GgЪ`LWHWF30`d9x6ur(;&pjZҕ,k;Rc+V'PwЕ T:"ڰ8*_B+Ȟ8,AW*պNnvZrmމZ{Dg3,=&DɲsטZڽ G bνgl;᥻f2x3՚"h7{k8,ơ99Ĉ/Fߩӕ5rhJJe3] ]qwIѕ޴qpBW:]9d:@:z? 
-Q_8HQ.e Z"䶄&Г7hh2BдC˒i2Ir%"TP4tJ4{v%d٫}p uݎF 暃8-u&zTx(5Ά-3b1^xK<,jVX!J`BܮOTQ'+0޷gg^~y}53=jt3s9^> 1JJ5m"kX_ Ҳ5Lᨩ'p{UęZEMSL 'xV(Mpa#AXKW~,ẽBʣj3(;@dzC}H*4Ϭ㸸ϣqYB~䀔JzfVSl:V/> 1ʞOy( bɧ4:L~(j4v|Wj"Y>t&aYv=*w5 }΁kPKr4e: 0xjgBp hM37U3'^ON5p皣!~;M A:}FoKEҎ x'Ti@5 ^D$T挕%0F!bxFN=!`zp#DmVrr~;KNZgs :4nZWnRx<YΪWf F #W'HMy߽<'ioD=OY$hWjAFoy OYޣ5i!|+ ,1BY3dƪ* +e=@СcV-E=Z.+]qX4JJ*eJꡒ 4FV6 j7ZmwV67L>MJq%'`z^=_>>]h۱=nSseUsQM.ƅ/ !z[OCJ5SL>?gg`}\N bV7fOi9q&~x>~?\9rWa3GJPP"71\{^vF+-9ek 4ہ1m0sWY2 0&͊^v>jXф +DJ8xe eSD'RFR C:i=&=TfJ2'VYiWi<qPerIp,^M6ppd  nGk,2X <,WZ`D袤Dzȥ(F=?1XV 4=_EDi`i\r1%RYxQ6b /I->|'TkK:L4p)ȔH'X\(h|`f7YmpF#1ZfTg ,m`ZJ`VϚ* i8w(J0 C3'XrlD`P+ygekrZ $J\j>k#5q j:IK/^4{ }&53\ 4:9oFf==`Kڏ/O_|p=Bzxh]|kNM_/䭨WkGZd*N}!_7.O?kdrj5ˋ3z`FYC݃*yD%p^,{RL_[κbrXҙowldϵM;¢[h`hw~!X%DsC<$K&d ַC2V4ҲYhB(2RTarXSQs^j~n%6 Kg ,udҮ]0,KC`ea+x(!f.1ĘFSeeL|~2,6#BY OE[xJ2˦7 _D]&#xS3X%r/M T5M(zqݼ{{rS׋2Xsv~x|`_ !V'ٛV9?݌F`}xJ GȉΨqqu1|p~;+'DsMR.%G0a:g9>bWSJg+QVjr|v|XY%Hh( KT0J Zͻ%3J&JM|X*O^XJ`ϚX[%}Xͭq1t ,kdgRRٛAFi٘0X*}VCsJ>kAZ;]{i%Au9:=C?"}8,'cvu,޻CfiK]!煠.rP⸡&ʏbpyL.7rqE_,k{߽Ψ\loYΰB/U4hTRTkn a+xi)*M=hz jfh.fusJ7KW'JM\ B󱹚-HUk_'a#M_mA7rVŸ7n2bB ~k8Np֤1ʘjh+KKJ2p{u9Y`>ZMQa$D}R}~"dN*EZ9k]D[RM`9KòXeh>ADY,6b2Ady [8GY !K:Yi`+;VIW 1)T/&<`"*QiJurTd$OE[gh kNg o%4 _Aœ/ Ì(x3L9#8Z f81v*r2QֲYv"l5t8e =eՍWϠƗN^Onؼl17CNue9OYEW&vK/K ҎBb -X#ܽ9N9,ڦ B%X 8ڂgF2c ,I&2Yųɢ wq<ږ)K Ͷ)CY gJ|dEg_hhb&%N.i/*pz ?!J!ie0Wf\~hmEoI7/g,m5N+&pyWѸvQGD6|] RrmWUjԶ7vd;Mgxԗ͋)X<XDIlz6qJf2p{6r_nn1anE~hgHr2%?g(ɢd&=ZJ,61%s5gJՌbڀ s\츓R!MMj+1g!wL^&)BO۳S]%:0>E{uΈ2!,asq'ZT$ Q5ΰ,5D2e͈"ׇXI3Usd>6L|HLևR\i$CLۛ/#`'5^F;;j$C3ŝ<`U"֐tT&/ubHdw8q3++M$az47^+kN'5uĭ4cO>2`3ض&V,C(~Zkkn~Ef3I b!RgtXpINI\˜L s5d5PcR)JB AmW["f:=x/܃]gs|3Kmsq e#|V$[s/v$.K4{Mu,_3I7חH?~X~ E;=Ah 5LZ-nao)ß_ߟlaz3;O *ac\;d:[:3^Po7| BèW~7|ԩPOzw]R+J>zHtiIvFu뇟@'mwho@[\7<{XyĀ }o\_ }P֑fxiN\ 1Uߘ[bY i*7Y R5O*gܕ6DL\f=Bk͝cٱs~86~g)VG@#v ~W)#xDsNZNL1}L5y P#drY>l\n1O`)7{uiޒs2x+E4 'gRd%+1PXh9SCe,9 -\-Pb< !25-a{oK1›tw]~RN#[SsGSx+y:[s7beX& Qk|܍Ghtxp"^)q?r2uH O-,'DCC9RLcQ&K2oMeu/f%Ce4{Sz/':R$CPɬp!&SL90[okM;>>b+QeS )D 
e8/G%O5{Iˆ+רC(^hy;tJ & > n41PEN02z?-;H1Nߜ>Σ5=5K֖ ) %K7G?P++(lI+ 9Ϡfo{iѿe=tїϏfYoПz&*-[%zysƭxɤ KDS d@oNv4`fie%xd$ލ ºW)áa29SEopFnZaU՞dUxf܈W]E}=SxE;خg4±: PJ,qIXd"1xM{f&3y"Heާec7^BgVYVVg~˥b,G:sϽ=5速bJvSi4S$2i%DL%i㺾t.&oHG9%T ,;ӱvĻVcG#ԥ*^b + |Tvd 鿤545S2# I+`҆Z2@FZ]5h=iY2:*- ^FsCJ߅ca%m^w5DInrA Na)uHBP6i.O1Im9$ѳVo zH*pY* 44RKһ\%I%2Ggh,FOAI$aP},RM,8'E %c%[JN`%!AL jN5oT  (7xv}*7Y0*0 @P)M[b<-]h^gO}5u98 DV<cW#hhJߍ B(%J8nұ\YpP,~!lϩJv6>A-8;{bh-a0D_ySZ)S)2wWCPG4$hW[a^<AUZ6TE)<pz]VFhn{cw^k_3T~IPj1,żխnNJYt^_'!N;+-te[ )vUޤD Vh0N)RF\J(4tb$eli>Bw#1IogaP+6};ZT܏Ga`!j|6<%!ThxVXA ~IG )* MTuvS7FcsDnŷduq愱fuHWahAxM ykjj΂p2.T^ 7P]+vjϑ2Nrk~}"Rs԰˱hzϒt٭Bw-!#=IG~%!*AߪZDᩑ<_ƻfJ7騒gT@)MFEP1$5n1kLetٖlM4fDn^ƜRzxSibks@8dV{^^P:m{ĉlVt/jGLbE#% u+dot!j#:[bцp ,.ۆP H<&#u{CX6\LtkHUmGK(YZέgۄ߲{?f:7lB}YlG|9yԿ ZY6=&3\ iU YA)RS,JoJ~yF'ey)e ˟>x ,W=vVԀ!k#ows<,wJI D)d-t¹qp@8GOgVZgl ی asCbuPFݛ˳(qCޫ8 [VLĕaYMQ"3c%9ʿm_} yP^<Ԙ,(bX9ib0q9KTk

X!N7$>VĩHYĝmsb=Kk&2Rf \oO|$rv>Ct9 1^ԀLw9N+}wD0%ԥ$K $ t;+Q?Ҿ@l3ziA`>SZU%ȓg7zy H\f*($L.^?7Fȱh*r&dC8`T,Ey52^XcT bn/ˎT6/ǁJ.,`/-Y߷TSgP&%E >tnX 52>:1FZu: asr-^f/OP.ui֎z.0q S"TF#b;)r(X0Wp"&sY"+[0};)= C9@.%TdTù^ygb^c.d.&iV1* a0 RghQȘ$_Zy3 -q1*¹=r۞@O-#).{ LY( ˵y]K Q`Rre R%b%(CcUVAGRR=f +?7$j%h9\~NvHɐU"g1؆d JKFbt)Td96X9 - VǤG4ǯ_LدD—ʲן`UgQ=>F0! {0G/XV{:`FƔ*G%Ę.85/˝=nrʎq& ⢑򴚎_EGx82\~\#4@]Q 6' Fn!0x&稘I6Rү<_*{5N}(.4gD-F cu9*\i^pGfןom[PKy+}ڤg$Q!$ѩEbv.[4쳬1x(" -<@,PeU 쪳&Һ,ҰHСRtSIK#c[#U $lط&~Yqk{Xr] mZUGeJ_"UcqǽkM8߄ y&\#RFuci ˮeR^FX,t1K/H%'kM"ImNF htK OL]Ā'f6aaF%,;k#@MkLXٗuZ*;a* ZņZfFI+tf/Fl0=c u"Le+bQVrFǤ9107qʷwh8X}F5Չm *ppaKyl, L7tx+=|!j Vdݠ߉A4hdl=GE Fy:k׎NȄ*jv_w/>.'/0Ԯ)4e\wZf "9@qD|#Wy/bGn-zݽӋOG G lWo.[/wezhmiU|['{(5`GQjMgZ7;BY<.B>5pJSzN,`cB@aДn_QȊ%J_^A(fߖ__?Om!x&mV ;miWބ}alD|yo9Xe2h$aӆ -s#,Ai-JwGVR̃٘Hy?=i&uUh:/h+D]!(d[VLbN>2ƺL+E]ƒ6ō;<sJ;Lg0u1Bhß DAno`!Ϯʊ=Xx(q~gćChf|`Zm>+`3V/v7ߣux"U!`\(NRWrЩ?Nɋs:(w-QhF,e6Ń>ys#^?AtpInx}2JYi}x?տ<<=:\mXDmc-tF%a`kOdBv$lB޽`vp8ypɜrL}N!_@ЧTX8Vx)I]ԉZ.IfmF3m+iQIyx!8e=#Rno6yzjFk2M%׮CASsSG Jԕ]\}"qc;:61I?fl k U "A'5>j!^ pM_A޺.LFd=RɆIG.(/:N߼mA(u$h_^3ŇR#cT4LO-[ 5FƘ&@iF|7PQi"^ 7H _x-ir#\+ei@lE) Ldۭ;`׎|nNQ!%\ߍO{Bn `q "FVױ=i.< \Vm1og-ZnMIa$ƾ Ba][kQsrdcxa?O(pqv1P)Zu3Le<djȦ^Z'KQGn3Si؄X"UfdF`TS)9.[R#rCNz-߭3i!( }^ {n)%t$ac[bh\ 7SJȸ<Eׯ6AevaL-"RN51 m= +e"JK7 2 Tya P+Rue;~J ypţ\~\,Z+`icr)udeQeju>EoN'/f܋$efȽc gT^^"/W;#9yNJ PQuLji>v- 8U,f >0n=L)Ed-"xSM}$bi ohF^\,0WbE₠t@Q j4;r˳ۂ: T]]tOcC"LDJ@(-u֗RDʘȪ Qial"n-ru9Asʬp4GvН5&ΦV`Z1+᭩LH% gGi~~Ο'q.wˮ&BRF/,\4Dj4RIy$#/eGqslb:Ʒ`pڏcϐ9iĐZFM8EsBnWU&>I?g jd1¬(ev *uXN.ʾЋi4K<_=[#c.@iD-AX\kqX,(,j* F1QQ]{ S WU9/h9; UP}qMpAQM_mIH~52& ҝU9}uѫb}s>_Z#HXQQp+Q#ԇQu0nQ`g kveY #5Oyܥ闀j`WpeGS#Wy\dCS2~U:^J2N(G]c,Dr6NxwsF ҶsgUx\e\x # TɑG/ EK{p7,$(3UO滨bPc(HULYHuB%5bJyJ%ҝ"Ev.[D_|3;tnb-ɑz8C[&Uz(6)qܻyecjjt ~؀'H-][Mٰ֟n2گHY< 4ZJlk;_̒Y ˫i-yԲ+"1qnzQp k)ɰVj$`T-H.Hܴ=u؁qČ`0`N`5 ]&$:k/Jzsu0Ws$QTQ{IĻyP1H3_6̜M}p'}-);1Ve̽tITMpuC 'v[h]8.,Z搌^UxWjdBhK۪]7gƎknpLOҬuFQyl%D} uҜNڥgV|,]j;w04#M&o:^}̳ M2k66n3)Ǘ:Kj. l߮ ֝˜Zޏo]3<~Sp1 OUg4%] ELm((,zЍ!*~}:r?] 
kr3yw2]ol}KxTQU)_-'PuQ:((k1}8n}eu6 r6{6h|y)~ n`ʹ0?.nΘI6y"ܼ!~>rk9"&>R"$CPܑ )0jg%,c5mGfHfD~} T8?.i #ϥJ;A#+s̿$hgbW ?aXE6Jh$o+?ˮ XWiMIq+lђP-az'JV"E 59N%FUR+!5\^Or|5lf\d\emA a[iѷ*K|dr>?KCJ՗E˟]ڠI[kK#(m 7hn,BU K]"ՌRAAwJ!CS#}*TƺӘL!((^9B G K.,|.Ʒ٨وœV^]߀G%UڹLS{ r_*|.~XZwBRP`jBGMGMXФ艄{b`՘X;P TxA 9G eƛZf.aڋ|aR͵RI;zk@$R?ZHBZ8gk@ jn[*e/ "DۿK[^mJ?6D8Zߎ&aBc|M|'B?'LԞ} s XIM};6t7/=j{jm"/{rラp"0;!U_WtC8B%HVbg#!7SK"ON,&ڏ?RiI,ǥ=5/Fvu7噎 45{ndQi I3"+,4 qQT #hb q;- =Bpjٺۯ_jG)Ul#s]inx\֖kϽ&T<~#~. Oq|-ۻ542to?!S;^?$1w GaW{bq?*7uN a>RZR9҃A/n3)xeD7i@{$7޴fiQ-Ƃ < dy`$w[%PW_oPF9e3O?.QZѢ>dٸ=_ Rf:͆QϽx֛L``dG(p?]fϴB 4 3gr5[LW!:ſl~̖Y&__<G~7 fƿfa]0lprwL)P"ޡI J01,_RyZ#N0T(lRGcm= O.JEl2EEމ`JP/Tj .杗CWTF?)UpܩRG~(W2ȢyG;4 JHlL7飛-t5a q0ݘO81 ? g>`$&ծc0_2rzouo6\bf7j:hO9y `w`#6}W  mB"kp%ޭ}oc8{psMyG ̌G1x2 }C]ß}zF҇%_z|O2+uJs67+&7k3& 9Y%|du) X ZޢeYSCjnHgMON?m!j:0#e۷Ί˧.v+&dSbkX2c%{jWڧ$Fw7 P:: w5)Oڠ| ,1?;H`$%RP):QcrJΛ+zؽDSIo K}i؀)9 CkIhy%Ԉ}!j*ՒJ$=3Ur81A=xJһAH/%_,]Kc]9xx pְ)eAql 2.=[[JVT52Qr54m@O wI5" k1[bZ׀0ciseZ ٩DJebWÌ 6E ~7噒< .^kJqx{5?" 0OctI>rk)e6*TRLv{BhR+ ǵI%yࣛYY%gbVf0 yqSc%'=?Wk|)%i&#A:R8&ƓXL:JY$\nmlv ,(ES,qpiF$FƩ\RQėR(&/1)^&Üm`WBf d}1aOOwBՑ)nu-RLY@bQ⋚dXK+P]KxV˥uəH êwѭ;?~T5Mac%A\ !)rDI^.wW0܅+RQڨ)O%! իP0rIX V l^27Wg7,tS^҆rLUʘ qC2d9,Ek5)_"dsqFޔ/*m`WX48Ȓq0 (w 8vpLz6F [hw/鍲yLЮI[~ꇟLhpFd"hTFInQn\:mrT<>0P^<{1d8b -3;2dL͹T'C0̚_N܀|-h/=zT s7ƔnD{T9{77Of}O_&5-|QJ D!(}EbY~\xoM_&jGXLPFḒpZJ*Lsg/RM-Qv^@d¦{gq}Iʴ-ht>icKٍ-i1ˈ`Yf2'\ukKDRh4Zr7S)2Owi! 
2w*>ۻFՔN98XF?g5&*Sys+8G%@yрR`$+C0nxlWǺ"ij8E<9 xsJq|7TIJ*X`Є{l8qx7S[)H I22&4$Iv#+{kϛiRH}+ \u>kOvIԉA!A6qGcB{҆FDOMH j1A I&$͒ļSBx>p:rƉr $Ћ`51y~u$I~ 69,ְo k2M+u) T,?|ɭm+eRWpb(~1 ( (e:Oz?׫8Ί6C˕?wt|$ :qh |i$o-8y8vYⶆ݉bX9?b1YN c֔MƂY4O!G5A Fӹzgjh?FM'e;(1bÉD Y"giJ  4SsS[{liU-^b֟φcN7x>Ӳ9[*5`s'K'߯3)cA$X$A$rLsP=2oTܾ~z;)?zYg]UiqiHu{ !Tl̄yZ ǯRgS q.jpPrf\(j?`F =0B%({ ld%dFHFN(}#u^ hk8+͍Ò.d})0UY1TQS]QD?xdAXyMqlMx[(fE-w#˭E,uj\Wn*.4Ax/n8*belsi>\M`#uB$?9s k)@s2gRF\* i\s&i|S26 zZ0deղ 9$v(ү*M0/@`82~0DJ2#KO3-y`<,F]eY)8*Z[Vw5eDNW}]u.uw 2_(í9[Xܗf2l/GSbd\FuE(XWC0}JV.9`ַ,Fј-2ZƉʨ`\-|ެGj}WDprߕ$թSuT1U~lV6+I DDa b^y<Dȅ8)86kA ˮ$+W%I_=6xo.j,HErW嶴ݕ0x:?# ѩ}]fqHqqop8d[.Uv_<揫UDr@NBʡ|Gu PMph )h@;S8Ia%#-* ㈏޶! 6hI( T҉*+@7$hpsWz:o؍`h*F%GO&E{؊| z`O:ӋS6B[m`R4 lOJDӇR[xǷDf`kHnqo셧b<%ye6Q\|ܐu.B;_Otۙ+[9ГNk>?t![d ) ,MPQg'QPe\LwVC/Z>6'kd?ŐsSJ);$Qy1sX#B2X 5!HڕrƹqKD{~[/CNw'Ut`Ot|T%qrAKo7lY/{JhgskS<],NR/O8;Yr:N0@ ngLޢ8c_O~b-WP$]n>Nw l׷RwfN|_jn^m)qNg)8Ƀg~7ћصLO''n?MR?SS_['rONf낛id&Eo]>CDҬkjriq1 78 ŋy<$|)CT'ޛn@L~} 2p.Te*R^az&z!MjsY6/^͟G>A*q!HkA!cC/q2ķO2@`zz +KP v+cGlNMOD)n\~`R;~MT{^hǚh^x1 zU #L,tT0a !'HHa|y>=k,dߩKjf7GXW Qo!\ؘʛOɵjc\7PIh5{7Mr\NAj@b[f=GZQc+l/|!:p@J8qQiT $8,Qk 2<4*m~c_m>fj7<*dZ!s rdR~ՔZd " u-`s~8z\uGPj(H%ܲ܌}p_2p_EoFU m yA긎Eށت <Ҍ)r\JB(NB*3Ƭܛh8 2%{goX볾Z]-YW cZۗLU/d6.u>Fĉ^~dyi@GC&k< Dd5\NO~2L0 9J|==x BRq!u3eRUYƪvBK1!.3[vx6A= .Ntq #V80CO]hIO c%V%$\Z] `6, ܬpO> J; \H4i^K\+<ۧ6QzJԯ6968e?c>??ӛr~Tܰm~V'<`$j_7=2BDT*XΚݺք/+SSY! 
*Z#;AH"k 9QLcefP>y6da!F_bTۊ,jܝJu=o@J  R{vTBW[Aݑ +V^FaXKٟ&1"`WJaYƈBATC7񇚵bT w`im *A=?j$j'p;2?QJ) :~b$+Y|!We}\Cm]\[RW8kK32)yr27xJokVzTÚK܊[PBZf jgBBa)_$ V/q퉰$,!-”S[XO 51K9wbmlyKTI$σBAX=u!8ڨw+ =o =LD WFWƽ iPDd$D=;L4oy0ߒ㸰<=a[VGћ  |[ աN']Kڷh8/ /370~$Tx;KfҜ 4|vU5xV91DI׃ ;o_^|_;UrK]vf~\5겎j]/e_r1Juy-8qz^#mq7>7Jtu"~z3`yv$DDDC1l2&cƤ3BD{/>rZX6%Cqe\"Ր-HԔj^m Huod:4{Y9 Tڜjt9XR)}w:V z!V[}DPj.)o(&Tǡ#҆Am϶䪴PyKE{>)aZ!O'^FK!ˌJJ# F"`o;U2I2ZΧh' *Ytuv1j][*47!nJϨh%.ݪ_:8$lͯfӹvK 8dYIW4.?L9WQ&R*zP\I2麜ozr .nq 80Hۛ_~?糕F>|+?&R`4P(1cF=J(,s-j!5Hz˕~g5;N-BDxMwj"Θ!LCM ZS{SQ/.*3Fih*f9)+ 9{"qǠNG}PZp+@Zp\2ňdXKH^'igBp)ڸI `0 Q9 p/ؑ:j f+_DM-S JŘw^Vtx6>C8 %C4C$F{k,qR Pģ dE(#ǝהbD}ܜzH{*a42 /mhO3}r:`ϵVS g(DV" $DǥwL1ʽ*piP<6{wy~A:]<@3`w8gctqإA( .U*`'X6?JI5!gOSaZB=V#@(DxAH[F+jƠ#8&L9g}e׽ee6jw^.WM9'Wze?mi~dm -9]Nt#0Co1J2n?8ŏC_=c t;]́B{da$yvqq4aQ7/jD]VH|4n;绍< 'kbb!7[/](U}^`~ڎYu_dF#5=R#4b9E`^`OH]ܭ;g7}yBG7pQZ^l՚I#t:HSU8%^Zx81uR9nh_}'_SUD:GҐYMESqcsps&C0:Wٞ5U7(fW{U0Nla6 @l%-⃱ldDh#–۽b{:ӻ GD@^7)rW#N!$#&I9. =*@R1r#Lgz fb/)f ь/Lw3C jiVChtYD#8,LQɓ)han{qp7/NZs|1R p% j~g~fǎQD2"(huT2 |:&nE7JRqpfjZ9*+"Jap4 .MfgDŜe*Hp c%~Mfz/{\z@A|5Փ"G^Q109I䭄XFpe'CIDZ31Iob-3Id9F/RB؃|Tu;d^#Y`)6t\:vڋIqz< V_tq+;I;x~v|~aŻׯ^=/kHj Ll\zl:,>f ?HDWNαtoĐ޷+w\|UUP¯s^zYs51leT^j#[[Џ6xD/geE^~[yhf~VX*.>G}PPr8>U'ĻxWjzָD.MJ.:Uk8{Z)֎qs0YcQso4+k<;vd)\dK"[nC-1Fe7"[7n&5fVgX>7BNF[ٰE->{-Ԣ;vQnE-.jn,<Fn$.F}9Bn$to9Q{/za[[0Nh/Gp}}hNX:/?= Vszqd>:=ouGsƯ:韄Dߏk+:&1ƥiFH@њVH4Fe|H=-_}vFG)U7O>ewe0+d{y5::FGZ輾ej-61CB9B7[v*ӊRLKM-so4kܛn.90D"Kv?<@\$s>xJ3+vHbWBKt s 5 :]ӌc .\l~~yݭÇuP#mJV*%Q\rnΰo:UlKFSIEK7 ΅9&"S$}u̽,{YSd+;Xhwcv)e|AΏ k;-Hs\f%of:K8Xi\Սv͖Vf -~(y煣HE/>5˃jfU8rֺƗB {1?~y%0ͪ~Ω|P?} )fwN_.Sp1)fݿ|)k氏/rXcp4s5\dʮ j/O]Y{˷D91bHoyɫhMZQjʴ:hƮg1uJ`~UHk08!+[)6|)Imso4+1;,Gig"1IJD,-y[[K]<;rя$ol5yoE<|dǮZ:QW=QFr5zZ8}E` F_;t`fѧ >grЊU%U`:OCыZ7SfTkt krIqj)3C7X,KMb g=~c0O uJZSjfȆٮ{^Ns8566kUR!mKSmS@̈́Ts=dgJ,I*5a{5y6檩B*eZfm-8 ŦYFZq܄r6%2sVlNZ`ν;yRzĮIcAh`=P5nN+FZ+:d0.-8JF 򛰖L36%XU.6kLʆ" fdtHiY;t.|aPjcK` x)xho<?r㲹$KՇ ev ɇVgk?-$wW+f.7ر來gb=%nGq`D^TJ^+ZƩI]KnLo16L[NF_c(oܬj@H(}{KUhIBt.hǐJc@Cs)-nCvj5~YD2 
v((7GvƒKj,%)]$XcN+ _ѽ* 5/'˟⦼2f@'+dD$\x)=u'BV7JǤEϞm/2H"&"Hf L}}(!C~肜ȻM6(mx0]-Ȅpl&pM!WG;nrBpʔs)+n))xkItZJAē GO 2(Ŷ}F4&@y#s֨[˞׋Ue,5*cQKXj2d֒z,y9ۻ9 ь‰eVO?*X⭁PᯇTZfXF1GӹPmidTtbP`IC@uĂ+xrVϹpsҺ5E-SoN&O+Q%HwkWS'e(qHٻ߶}ݝd8$[`E1kIlI/G-Y:|d>h|!C{ݡBu1M"&/e &WlR'rAãr=9.B"YNVxݦ'3蔘݅ ¹Pn'H,5ơ5Bdl9QG.*)*B*iEʱ,I4,\x,/(>ae"L1OFF'<5FɺQ0AOZ`1N upg6]`ߛ}&|6) TTjٓlS n [MdӃ9m}6gCEҊ !A@xʝ6q^(/tNO.TLJɔVUEsX"4UqkL?'`)G+{PV4f֞ Bzl \TǗ|'.;;YD $T>m~Gtvr0xokqrmmJ|Ƚ^E%di!<Ci9F&F/c7|8cK|Al[4XΓoЀ+ "I_l^`WU={Ϗ@O0h /oߒv =TG)~Æ*u7MlEljtYG7AG^(<..ޚ"ĀfzR3E44#5jDSE8V ًP ==EN5yG6m^drnUt֩3 2OiM3dN%yw"$CU$ d,?~S"}jd2I'}]36˖uU;zaã'VBĮ1M[ݥM';zo^{Tؓ(  *'T]@9cKQ?z^hJ~u)hmz3dfZgGbԌUb.W+?be{q")ia6 bOO9@ )ֈMLBzB!ܓsEHЌ)$zs{~ u'vsϳlйЩ8߆{zsu6l6fM ,y^o߹W;I7w$v$u $AE2hb]%M@jjȡ5Y66{, B\ج*fQ 0IN`SAh#7C#i(hnhafz[ 4/+z-|D̟`bEE>@ 4so 0!rk-+nǍ{& \t ZÜ-b_b=[A sx?(|XMoӴ yI!8'jϖl|h턌,r& 9ru;-]=Ǵ- /Ga6N7{Eh3ގnF|gߛ1NȻ?aw#$jSgg"HN5jJ뷋Ϝ˫X`̇%IIZg/"vKSh9o(90')ܣrؔd=[Z< 6=d{w-)9]d{0U>iKkN -Q=ndD`PPhB( u|-:H rF62/ : K}~L0~=ţ;gwCՐp:cъzaᖎ s;T+`XS_7sT/l X=c 0qB@P#KD((`溥2lùC]ǾZ=\u;pk-krA(W(90gDSJ|\^OwM/T C?&dg}oJ~ԐȏDO Bb! u^{G_/Ob8\ӱ֭]WC,^e]~&(cc 5VZ'>a٠zBLXƓEF`gu=`$Ł?_^G6loV1VsD.ءB pvKɁ9@2.>sZc\_].vqS:>0Xfa%=;tetw+JɁ٠d%JZ (w*o`$\ۈz-_ lך[w pwj? #ւZv*е1D=:c Eb8 kzWF;mӻ$2puCilxJ:Vܞ[h f>v% a<6U5ۗ޴I}AC ΰKX2gX:ݢGuN=>ȳ1>AᖒV0L̰V1xNex] UK 2f1di~u8Pr`NS G/}OQZu.dms ׆! Q]<UrPc'u̾1FcZ} MHNYásE9{( W ֳ_&4ւÍo_sɑ{O?c9byb?\E$qhS? 
EˏC/E)~geZ՝q!W;po)m"4>fʛQq3l}y~FdقnQ^O%DBx+A8"mKT)@.K.;(%S3YT3\3ZM$h0A:#P]B7og!?o6f"dԜݨZ-!L-4 V̑XdyiМDWP7qM@Fk+ïaH-Ug7P82Q*tlan: x{<5(( }R(gpR.FUC%Q)%*A TciJJɅkԱbn-S&AŚʦ210%L^{AT Z/[/\y6@nч#8UwdbHg~Y }+Bn u+8ù'7׃T} e*;6,2;`E(/KQD&$]Υ"1bb5Rp>S%W"5][s+S N>n::d2I99~Aّer.vO&-ѭnwiѯ4x~'EQH ac7a4EVz9f0o7=PNe˽b\ !u)ʇԢHk)Tt :~Mtt:F[w#BTyP u7aP||^;3Ӱ6mdB8մ˘';H,rPxӯ Iӣ/kYtEI-t GLB(ފW/W( \z_Wh|uunկ38+Q!8NXTqq1AάHG~[^OT0g);Tw֤(Q3-ã!Ny@_hOAₗS혧EQđl"%K{+ku4 _W;$e?|~d@"BNVi]T>ͷCUN#3TT}hc6bkBΚ%4eSVL/YQY:}T\ żP^+88{3@MG~ 7 S(!Q>J16˪VUUtԑ;sFg17:ƄX8j[be9';-1#R[>u!]oK~YbY]Nȃ2"j>y1Lsܭ̕-R[<(Bՙ-Q,R~'5OSܬ(51INwO%h?"3%QPS-|65݊qEj: oYtaS'P{E=wbRf7b.}Bg}:Lg#bܡm]T}sBt#kż{j%EՑ82=)> IMdV]u.c;+j 0B6{F\z! p2,qIq,߫J~ya+ɭ^qeZ4fOHKؚn+4~}'ξĔiN{Lvx Jɗъ#ؘZKa}~P]"TmwͿ:sI~浒w'P=T"u;`Xxsuxhduz/&%Ftٶ$MR)s'yh w^jFĭa [W-C8]TE%z <ڑOo68Nߎy:=w!)9Q3WDGX›VYﺻ3扤R;SdGLu+) y 5iGI&M-V.нIk}] =ii|G/= 0?o<HOd̋Qמ1%ea*JmQm::;^?X8o_]-eV\1.񲽩e|ޖa/_?TFm7J_m׷wDx'9;|;}j }MYvjlm2BM9r31&or9oLyIy}}aJFSԳJ/FT*Dփg K'ܔή>CTRtҹK#D&EҸbvN%j6ja6qXk5Htѽ͖SWb  >b:Qf )\oa^C.+FŖCIc_g{ܘC̞OqWTĹGAG+w8-߬;Mqxm>XigO y:@#oymB|v:VE兾yǔkW4vRn<OO1V]gP[IN^S)i C ֈ)o\:!RLZrǫ=Of,ҵQLӚXU!9 7HuM{.d9SnmOT02D5/d>D|\ԡup혧nxwKv+ɉk N}Xi/_JuB?ۢT)tVLL}l宰rV ]{u)J:V.*巓C0zΊYdݒD~*%OF~m Ȏv }*ѡLD^B._wdIdݑOq*w;%ҟAY1S6Oߴm^'L_H Pf$xO BԒ/8AXM6OM:Uhmf8- t(zY,QLeJG:HyIY]?@ymz7zDC˂;_]u_o{s`H Q0/zVD'd~ jd@1RmATM=ޛ!}\ckt¾!{ 珄Cޕ@1Oint) <σw&[ЃZԱ#ɐ GmF% qO]ʭ [f٤c{YIT'6΍`?qpq9I#2Gn%?y,J;ҠR{VJ~d>Jy$<.iTDPX<oAV|85葡|6A7s$%҄C3mqM7c\~̒C~Yz ?$^vNn#'f=I WzңϋKXBҕ7sIph&;Xʎ&͛WYOtGgq0eFEs_I[THyqWys΄T?aVw[sf#hI՝YFKte#-%ckXyG?ˁ.!T}O slutۛmb.Гb6rIIzU5{WW.;PJaɬ/CǼD3hٳHWCKuy7w URZ9-Wp:+Xn;p5}#}.6@eRuIXV!d#lZ5A+6rɒp:wEBK+I6ּ=BL$ N% v0G!S1HU+ptV jl?YcYY"Y lpһk/bĉ- ߠp()}zR7iS[BULW!ɸ 84:ajCkp ઢ0xb d\k]ZZjO;V:2T+d,aTS()Cqejt8Wʪb +BQkF)QK$=Fu3"8*D2XDU~Қ#ɉkVm\.JfWHr,qV$$TۧvKuu9$ޑĠD1uWBG(G5zIХ(E+S?Mplw6g!>o%,ɪjXqFSvgG98i&mӶ@E_!|GhUިRս,^5^թ0kV:kB V|lJ>N1YP Uq /I )Ͻ.YOETd`/4}꟯7vrWõ{]VgE~g`$y6!)j-:SRdkK)|mK?5Ų>=TF )^G9t}P@PY刊ԥnBŤ`IRV %KtQE(QAFꘋZH){P!"V^pf6ݭ fY[4>l8sN?Mhhj}DM,sJW0Ųٻ6dW~2R?_! 
쮍8&fa3(-I9vzPRCIfQÙsq@J#M!%}<ʡUf5l:Xwym"a–ȸ4ÚĚ^Ԧ-fboߖle~^nޘ24,0L!sqwb)?u*۝rhC3:ȴ9a$ukqo#H)YH@ bYGDm=B$j[Y9ekvhEYek-L|.aAMZ)[kD!{x&S)2D@(Ï=^pIw4ڌ]m7Dl r)0F èݚr_} \*}\a+ Xa;i,m|e=f5k~\[Im]F >LL+CSc!BkZ)=JE_ @k%HE$JB7CDO/c~$r|{pWҨ&&@P52Cܴ;z-Jj`^{kmOx.aĬ>Ifd5 %-j*:@9c6I |6GY032m<ծX.NHh'T䱛CQ攒SNο)#$M1S)++ !r!ZyZiIGL.AծO $v;hP&u5y*b %7ga ڝӍ1*ퟠ3A;/ ZVd !э+;;h JQa~]S"LInI '`bYT7!Zj+oZnzpOY:֓bVWK5"c='pƱ\vLK*e"u&UdM*ݮInդj;`S dHD|GBP QEQ*DAXsTD v[@DN8+z>O Z&EOhSZvWZ[O $OUM`Ձ6(jrCTN&ĺ¿qN:bE5JXk|JAbMu$dL3AҸDB=AqGL(FZcB[ ~WFB"ܧe̅`Ғ^zr 7A(+r` Z QIHesd:NP /\sAH|ͥ|d$Q45d:@5"8}VLy1)L:H:G @޳"wx Ӛ8rys%(K3@x!L! ڻlQ_)`^\erRQys g0'Jp,Z"…g ͞mdQ-f1#e752sx[^-E܂60bJ )! Rsma$|$\!X:7x􃱺cC ȉĕWoPse sʱ2AIRĕ׸>E ) r,աOn1 l *MaV7Mv'eԐsv g2I͍JV ڭq}r;hJ'h8JEr5?DPgv #+ 8}z%yf_e Q{\R'8N\2|ϳkNpE1~WPE6Ƶz- 8" 1{[D*>qv-o0X?8d`e$J+0- 9#fO}DZJAz@Ƕxs6m}᫋(w WOe բ:U#~cy(]#Exע.3ߜSj<,g,815!{u蔢]\T#ʃk>FR)"vWK0,%V?rՊmeW FP̥}Zv_x_|A\\1i2(hp\ga!@ W>4[uT Up%a)E)hrVEWK*R<A{o/`I#TԬ&j}Rk|^d_o1-t3֝_ Iu_4MY4~XYj-_\/7h-%pŅfp=$]i`0RAo;~ŝ L"y6Im|n6n^'ZJ/:F+)J)P^+є3\Ѭ~0O;+^(J̊BW껿OfN% ž( >0֝_z+L|@HQH3Ȅw.ȼƕ.W4wSM5B>G JoMG-sA\y42o 2bKJD:z4ɾxݧy1-M1Oh4Wwf/Ċ)of7zII9:) WnQyWq6w/K&T#oљC{ſҬ~ zͿs;8M4͗15L Jt;eʛ 9c=9)Kl%Mƺ1:exZ(uDBpE2M0ml ZbOTTNG:`|N,w?|7W"Ks3Bywwm&!+A$\2YV!7G7Xy>su$HJМ`M(Pm4@Yxa"E/s&}6Qt[E7?ر %Ju(6vƓiV9l- N;б:3F(t3+M/.yΟZ#vL>Em0$:ڒt&~x#SL@.(UfDnq/;Z}3{h\CE*ZB@<ʘ2տ鷘PԎKi<|q12m2nWKzR54'n?c'Zk<T# [ALk]X.X) +s7桞먖Gx|4*p/Up5zyPK9#(f fNO%݊w/NlW}Al؛Z7Zwv-ĎI4-M~wMC3t._B{ɗ pľyk҆:.>~Nܩ"RFۢ=]yןHRd'9dG/!|ƉpPAB"s(pT{bπo-Y &w hs&b'hw&|ǴTYy(wS1JtkV'}:_r\%ױ~u~vػH9(&!jGtJԆY<-(hR3wͻR//&;W7*a5g`i6i\><\ooҨLLi[hاNBD]]{9N.erۆIy-`ZA,{|cu5˷ PZ*í)rJ\N :0(EAa:,g_aPAmYPy Wh.+\t=[\yQrq0.ל*2d9tsǭ Mgdm`'D-VXD,"G)aSˀhTVj{WH^V"h_bSu,n^EkiB@;I.IM8*͑IVPaSTRbIȥ(C[QT[n~|У"Mz">{y2l(vÔRhRi j h\>7D)| qhhPD mv4$$'7HvO9 =I.9.mhHPk1H$&I!A*'d ᐔ!+U9 ʉ RPi],)"&}B}=ҟFkȗ]q՞j//cM|L<8G,t`N9b.7Ȣ@.$X _hoshYu"x1n&]% % S/1y! >"XU0#9ER}elK[ɓWoT)-ȳ!4eٻmeh|MNgrn2iz:w@[$' R)YARXmHH]. 
[binary data: gzip-compressed contents of kubelet.log.gz from the tar archive; not recoverable as text]
+,=Me DV"+0e*Jq*k)ق rj,H`;V G܆$gr,V˪yDh4!Z)9X i?`P{:w8T/f; +dy+x0GsMUI$*M"ȊM0MVHxtwr͜]*H^~%ϳ5RzٓC̿"͕F1|xZX"!β,$<&a&ESqTrL}<0,*0Q̺i#AWu18ˬb+`Zl-/"ޑY&r,`>Zhܨŵg'\ B% P ű #*qȀvMFP$J! )$JA(1Y P&$Pr Qe '+" ܌n.B#*~"p<0gIw-]aBT9VXTiz}Z~A:^_>~Z ڠ!#)AOv-˱n?/F&NY{5ؼbHy:L燐T+nU4Y\wE9Dz|luC7CT MIӭob,Q 8V9ϸ6neIHyZ}slGXfm7n5k^ـǗ W1Y%ZU y,Ia=QV2k [O1C`Wo \4VP=x}> 6Bb%@!OJ!i\Y;j/USyc]E_O=D`l9ሩh{9D0ei`_APFJL;_ -zRPk5xaޜ9WF> ᩱo0hw n/[Ls1PMX"JUV}./&@-mpqJP'*9-Rԩ9=Ka)*zMa`T gZd!fP m84qƀd" &Q[} b.9lK}E3AZtZY"48;MNT=[9C;ٰcvHG+|ZBa갟4а ;*kl{`y0cgVFG;`λ0$(ĀKB6~|3R[]ћur$+bިdq1ZӻN*`d@0FzN1A~6 :0SDA*ZOOm|a.,g=~f{¿M7|\pwuޠS[f"gD~aX"16?Xn$_=ݹ|o'S#&¬rpS̰Wݥypcfy4bcγsUuvX?Dk&4TDB$Œ) Cj_'2ơ&$?U\\Gy8SDu),LT@0Eޛ$ˢ42a8`։f((RG16&ܨ6^/u馾Q^Gc<%yuTͣF,XbRMTj@Pṣd 2$HC"y G IEbYb_O ӣ$,5%$$H2KriHd CfS/H(|H9^gcF6[@anh>EԪ?;neo//0띡,/0k=YnI'(IqJG=E$^$Rغ8" HXZc?}Vī>NG:WP#8Gփځ`ӝø^O[Bԯ߼{]ju?MϡlrEGy icguv"a*&1G>/-5Ҹt2_;G 1tf<ЙgtA?^^ۯ?{u$,I/4|63AO?&BV;A `M luts>H#a9EgКc_Yv7 HEhwCRo._[b 4!5MѵQ*!K|,MRH,Sba44D% 0f Q238YjDDh̞ЈZY GU[L2j1l_3 >q0+EvBNe>dGj1 Ӽv1%QmH84q]YJD3Gj7fY=R{v.~$) [P^>`j?-l"*N^9| [ ND+.<#eXzm}@r[ VSN`?JJO@4 SX}7 ʟ#S1Zw>->8=J&ӻSX܇駼0L")^HMeI_OFB(ժ?osd @K 5"%HUa3,g!F5: C@v˶X?D8I[ .m%m !`L(XzYbޖ3:$@6!ĞC0Vlhga4~fۻlkXqluDAtՠJ2a5WQm^}៞ϔTe˫wT?w.$Eq>彞OlfocIpf.O&~}K^}Zư%0}I-PfR`(W , H3=a$$bi0K$Y"/B}LhRǨ!&a;#A`'E݃UaԧnŸ:p :\, 6S4fR Zn~ LÂR4q"X 2 1c3vUZ8S%`pfvnӐ8&i"L_ɿR{F]רPZg/V#s|7z9-@}eץamAPRI__yA/\pe• W^X )ih,Z9e8D0Θ~N!M!xߘr)J $)PڔmsypinʔsNʂqy B7va"]ܝh6t3N9߽M>荟_5'=#WXxsg-_㕇s;U5I\΄7U\ 2M4x^L3𙗜Jq ؽ s) ~h՟li?)U$=k՚T;EvWHRc :l߶w֪;Ăk=g"Lj_Om+wְIl.N>XF5sM0`Y'O<޽V۲Ep w;o`m4紇c8x). c+4⏃EKrd#c^%z'Nڪ}h=qhqN6W%r>9؇VZFT]#z_c'p}-޾]؀*D%t;=9|;d89@Jgc(Sꉓ8)EԙI5POӲ'`,zh$w5'R7qe&:eoǃqvO,v8@?_4Iy, wn+NGƓ,v)[E j@Ac rS]>IS"i/ٻ|?5RHT/pio'n[P(ɥbɒXO;J"iG_s 0 !$&@jPYD. 
2`gp- [9G)}-BLi)Жʹr!SL YlҬ7YL_x͛ z?Q1~_MD\~7/w7/p(:B#^e+;ϰ1^iW~":IHQf"MYroYx1P q"2Nse$>=wyj#Bn5 X]} H0Ns͓(f7|Z$y$=yƚw^KP.jεt5NGnbzk8TqXJpEdlÅ"/FmFZS5Bw s{`ysK!78ٔ/RFU(xqE}h1c_~^{"#8Mo&ў` A6_BSԹY-YOr 8X gZ->@yp[ƖEcˢeزmÆщ?Vy$mzŦk`8 )$,5H1@JB*Ίq4ٜ 4˃47X6Sf-A *7cرm4`7:ywIA?<|;Ch=~ϥѶ*+~Սn{9˖(c?z225N[Q/ 2Pۃ@C-'Tix1Χ,cu{ Ęh=|o($ۥj܏V|q48;l %Qv.CRd idHZ}-?} ,&ōFC/7R JBKKh'kV]>Ft+m|KѷҲUL ^[tzn|~Cҽ. bYƇ؁ +Rn׎U{ggٛfMpr b2EX)Otv00KJ߮IoUl22Ahi,n.n11l(`, a ,rgEKSˍ7L šTLlCNj+u ؙP oW u^t2Nw g+ dә #bGY\=EvyuS<,@}{tIXUL]9ugÆw0%0;pJR+$Fё%;|#k!Y+:5ܙÇ?%\ pGD,mb]g@79d?mGb‰ 'w҇Á\)a! t9ARufK,Q|648#.y~Ytq⬊x:OK+aziE6;YFNBsrqPc[ɵZU;DPKqJ/p3ŏY|5= %2O4PhoɵK@ NT)d׷pE djɀ[\Af % bE.'jUW")$?#)9<$hsEӎQ3G~rxYxey+?qz<RE2CNYUl$κvu\⺫{{v( mbѳT QE]æXN֓#фt3fhMP="1"== x>BvA?'ntsT m']9+utsq1Hvn͉W !dWMa'J_?y1TcG>E $mOG)A Qrɕ70GyDP趽V׆% ﺁhqpM)-M61Hv[xƒ2ҹR/|ܴ1 0Y^ UWMњ /kNrp C+">!^# 1 9A4Z}1@äwzw5Кju<°H)+T} D[`-M8 磹 9 x?BㅆLQlZh% =bkpN#GFa㈞/)lTn͕6S߹ܶ $ ӝuwutITq%[L}M72zu"Yǿ0Vߤ+]=ATC0H{cŧt?Qڂ@,3C5Dȷ̲ ?SԒIunPx]ӯfFOt"jp_P"Ϧ㶘$JE`W+4%LqIՒ/8LL@$<ݾ3$I}=1Kg2E; `t[F-M`^)8%{ۃᆜ])A 6T#F;2hB\2_ 'xw*1C.9ГfoY|ocۯh)B]y/3s@ϔ0Ӊ)t*3'6j_{ ]aw;@J Tacj> KəW7"`p#($$NX-ў b>7 BIzBVf-ISbDNEhr`96w V芻@&@y奿J'\[RcN*LBeEaKeF*$/o&tsk?܏ w. C L@ΘѬ-b)G !IE-8Z<2a (B0$[knLJ;iD^Cr@J} {I:~͔jԞWc2gRk߮n2"q,% `rJ#Be A<éMA*}/5觷-LW(sY5qNI B ht՟ʜn?oS{ _۩riNui NJ铟H׊=-EbRwӻ):hy9U;76Iʵ:Jƌx5Y.n-G4q+*ZO֌i/ ABvJB&B&F"/(6Y4x)E\ U.xy&C'8)U5TC.w'& ?]!E$^\9")ϴ#N0U \&{ًWk"ŬH'ihU|ǗQV[QZD{c2RryPū:4@/<30-S_~TƌS9ͬ  }7.G#!c:^[=@bw|aU>i>r%E 0#|J}eWSB%Y׹L^k Hj3R-Nup_h%F̌fgx_B 1g#4\MZJ+\)b@OFiN~n ~ucX 0Li&Si5K\\q/n^^<_rd0w5N#ı=U.ŋFQxhTxh \ZsFd]r |=ˈ)-6D]?Kד dAӎ^o"Gl`ޗ:|5{?r;?5{B/,}~yK>U9xbqRxP0;Is ,Yȕ96VL;d!,rw)x6qq(Z֙\?{WǑ Lyg=IZÀ|̾A)K4IoTwnv. 
6̪8?dPor)^O20/~N aYMZ~z :+pEl__}w~9-3xqwp勾=tghdcp9I''i<<8 8`4X?yE7%IeЮy*c;3s Y*y7f 9ɴQϸjUs[[98x?V{լqTɓ昕׬L~5+kcu+̘ҌvGU ?|:q1S:항1סd YC7읤2tr n WD- 9΍w0NiȬ**o9zrnPWy`&󗴼jC1gcTsRĨhy F #4Gbo@  Ul\JYYt+ ׶(̜1I:+73 .j>dRfܖlW!%< QqDgd1g=Hs/DhF) yFZu h 5LJ*G$:ơ55E@"1<:Fe0nM:g)"m1hIy82!(F2j2pV[o"!|g[~47p9't^@l.Na"],|z~zM:(zj=ƈ]&ЂQ^sgPpJ F!T.kO(H$Gk]^>(O#l=P7|=F[d3 i #}%5WPsD)$GOҳD^j(1K2}Hl3^d7I>$ZȚ"} P A=GfBb;å2 {mmFF^O'e|}rsEY3S-eG }/{%Zjsۿ7\/^~5ӏ&ձ`Vu}B ϯikYݧ鯼Zs?gg.V|0%M$m@{㫿qA"+HqLw0ny =TEzL6*m|$3|Ђp|dA'47\1mf;1h3y w6>W%,Z/tQxlwD|7p餑1x cx9I/߬%/Km=X;_tW.p܇ ,z'ǥVMMRS"CG2:f-j2\e$1`?@_p{gMZHUjk{D2`q=olcp}4ƴ2_Xh+pn}s.@Lؕ[L(Z%"RT,i=d/t8悯l\5"䙲 T< Ȃ #EpI v|!dj v*9_y H^I%Tz,k h`zRXm :<[i4h[$B0zZb4}llh]-B%ޟ/~l[hG݂VygEh^C]ZMWt+E!] ƭ6ϟ1K'K^e:YruOGW? Z Пe=&Ē B 2O 0LA)dLTHΰ iƣa'Puf' p 6q 0z@7u(C PJh*Lv)Jt&r넷eБ'=*eI0E HlmY R&e6Jz0~skSK>~]j&`@}~zM"g{[5)F.G)r_I#X U(gch9ۃ1( 'Jz{IDOa,+HIMXҰu >dc%{m8:M~W†\F);'^usnqR9Nn&/YfY1t: DH2y st1"g̱:V2h\R/}?ޙԶ5f^:3lYR+ᑆ%7[4 XFG٢9v ;1c9|RXuȲ+¤Emْ8G2].GONl9EQ n8kY:8v분DB"_H@3duG1iN !V^\ELGndM&iAĔS$y^gu:aL{gV(T%` 윥zmr:9WɹOU}r'd{Ve띍|uLTLT\]NJ^i#m7$UJw~cShBNa\I Æiβ,&=&sRV*iaq.15G'd" ^+@H6sGtl-mF Zn"Ha__L/53C|u &XtRbBoZLdys瓌TҢbVM &Ig(Yg; Y)˦V/@pf^mI=5}x%*A8nu]*Ij\X>,! Lc}zB'[l2BLiC`6$3@V8.N5LEmRVl-;2G)&%Fۍ( bo{@F@(H) ugH-L(&i{{aBP.qR!xK 9[K2x k3Y2Gb '³.Jh0QEe{YwaXd<ժ4xrpLEBYAL$3&E2qIf%+!#z;V40k~$ވww~qh>:=\]oXYn)wk6S| Z*@$ D 84 2t@hF3bD5\'l ijq3prS_>_DP{`ӲL>u2*-8*v>ϑi<~- ) >6h17{*:Hl.A8%!<ވ)0"ШdA8REhƌ!Wp:l UFɃpڐr5aNw9(Yۧ1TN9EiDngyc-ΤDx-%8B a@~kbQޫ뵸F--"Š~$f-,WRa$t6 X 0K)'3 a}83e=:8} ,7^8tFZg0._>9lfgSNIwd4+):ߨ$rG1E{z-MG\jRp$!8"1Oiv6{@6t!}Wa'xL.|7 \ Xbyچ4:[A)&U^y`:h%T"Iyu}Mbֺ0~O<.lԂ6vN;i@PS+祑`m*Z6*,׹xVpelB(r\J@gk΃Jϸ Q Q>h_0M#]uT TV2|njtjRS!<-yf, h`|2X_orгۉe[{vV{v[PJh9{vAgw9zrbnPrL2et^Ī=nCǂF1I̭, 5vs9R{UFI*fy𾐃(B{ʘ#y>@tv|5 dx_Gtf~@쟤6=hw}Ȍdgq;[N~b2ΈQo ];3U -_2xơDBz:vZΊEgMBlk5: ;[܊EMB1~o7h.U)6VAо JQ%-jQ;hqRf)CP+ͧh{b.GO~>CPԠSjݮ ]=}(X;!EGSt9zr[h4@Q FLX{]y)ɝ7ۧ1=EU=OyQ4]$F.Ƌwg,O2*8ZP`:ǐ ehQxȞ]2F8&Rti! 
ꐎxBbxoH+7*,v2SfH  c&͋ވ4e)' PxL"S$@]}ݰ qv:}zxd9Rrć=PʊSv4u_x4se_<"B@`[ڷȾFoʭiC{.<%^pFXIͩ!q0ё:BQpF( p ߌ؅dg~yqUl `P׋kYDB(1]o2\2+{~8]-j)3}5WWJҕ/PLLOav7Dh 緿]p6ݴ="zS;nk[` UW5ʥ)Mpd毪,KU\J%ya L߂HBPot+|6R,hmF5B 9LAx!gTy+ 2[Qtc2MNfn{sVn#9F.h OVz'<95  /.#.80gE-+YwCh&]xm P[;[#I h3ůw>pPa+9߷ƾnjYǂn9*' Xo-vXk"!%.aDG8'D oD+=ÛupL3vmfI8/d>J5H> A%Ai1T4{s"8#s*X*I Tyg1/d;7,Q2sCԎRP:`B13Rɳ0@Y%D:-Mi VʔA4}d)rn!jpR0$hQIlBe##d 2K/sXY*$aIAFzda~Ot^=Vvk{K6o*z\}b0"7)j>4 ۻˋtG|uE/Gf; pLշ{f4).z#3lf<5<]y[B ovyl_(p| '~G{ǨaĿYSu@ślZ{R(t: 5Hiqv bH/&:g9KeSGJ4Q(j8a6(;ⱦ*2j gQ'bKV H*En% 3PjRgϣ6)R z+,+\u%>0}b>/>`O$S w. <爤<%у _zp ޾i8%:r{dD=_O-3iDEn)hbrXu؍Dc;$#Y*.őꍰ:aꤝ[fBK9r*ZzViw^=lfczMw\kfckYAfƛś_!S֘=ߩ7@eW]20QԂ)ō[Պ)tTx$TnK5`Xf8ZAЏ1X\wJ 7Uzi FC1 u>A\':yɃ.{t8 K[*ZE'(+"X 1S($XHZxkg`|,^]MB'5E{/zXؓC7 {p|/oO❽Pzso_~rXZ"t9 f;1{RцE*)@"['N#t`Z(4t P,uum;,ߺ Ѓ k;% ,Ǖ^Cyիlwr1.q v\5W$|lqOt7G}6y2Uz"zl0Қtt|8Ş(@t2aPbq經kwήO\4U\Z{&FP浥"ڔQRgЧ]8Tptp<*<RZ@`K=M;.Zk3IuEk&|kbm= #M/iH@":2RpBR yLD "4_q B%)=wʰ)]sZI׾ :4s9&;@">Dk8PQz#:Zĭd#I ,bgBL;XhJO)Q]|$|b?Vk=<{v9urf$M# B(LB{xz'Rr ›Ih|PzIhA)>\x$I5s9;#N5})Ts)IW?hd F8;yjjU*7;TR6>˴2#{HHiܹE>`žj0s܏˜\՞l ~x ׷ Ċ_ce_$̊_^y6?t~ަWtN~p6p ,{%V.چ$䅋hLI~nCqJOBbPEtrQGkJ֙vo\UQ5!!/\D)M-y{ RaCh;g_:]X))_q0M(Qk@ho75㥮ipG{o͆70MVp-Ίx8AY;AM7͂_bGdYPq9p\ڟe)—[YtM瘚VRvjvͰ^uݍ&2 1:hG$W4,kT!~]bh݂Vxd|hV(%X&1pK0X0&^,X|i ɮ^t͂BV>|̖1emi>ۥn}NZ;[6k hQ1&O7F=5-S>K+ oۙ/hwl9ߝ,u=;vv+nޝc(cxgއlD.ȋr"2uf=WLw6܈HJB30A pl{pՎ9/; u3%/@;9v.xDBPyv_#uɶTe d-M>" Qk懓,4cd(,vIDU2t> >"1Q&sQ2ǭXl3,(dmr.ЪdX0-e0 D#bCN`|ź l/QnڊyEHý@<$sV'A-dI^>ply %%CUm [ $F4c3q B#*mI~"n< !)͙娦[~0 z`t}Lf+Le1>od* +:t ?2?iVj3,*HZQNXt+ ry3>hEy)++Vu3/"dJ6cYsYpg6$n8 38dJ.[ |2)\պ7*h}Kj6GVP 4V21~PǾe}g׵~qf}w= $?\6m NV{(#N \#"lv%Zb~OhmJy=v[;[b\QGgXߊDbt;$ƽl Tȁc)o*:nh))0xS4'Pr<Ǭru&ȥ5Z rʈ(;Isf,vqU&X%)+̫ z L1KԵ,XZKZf k!6bRgPUc/>6T9PRI)b0 ( Y@ q F]/clĦ`gp Gǃ&dr ̑8"rA$ΑH:#1Vl>AgҬu+O{Y Z{%4bP-ڸg7)l\5 ]y]ScX4j=0{o_ BI2Bjӹak ڄ^ԮUJ  .2w}sѺ :: ]uWi Y `ҕVV2*zp + gW.ŧf ].مĀ E1)K:`9Iq%,'cQ׶i* WE{ew .-3@n'ew;~KU@soR$]nY[R4HBYv@0.n; [RZAL[%<#ti*DҭQܩ֦)D+, 1Mj*uQΑ7q&j:^JPU mRN%'oug(0]=S*AW[2C_ܲ] ezXlܷ] 
RTBپiքW7ז״i,Wv5k()|z"bI@mF #4AJͬLIx,Xps<| uJiN?H;ޛE0"[zkG;s,^yɺw~xx^0$h֛ٓۏx /{_?y䐇YL,[PˌтW)cssחpz ϘRX*ub[@34 1vHY-48c&XD$s2f0N*]$,Y"iEò#FNI@pA (BLc-f@p2|0}I|"u^HУwLma(T6lYԊВw1c BIutݳ}q1*.x9,ѢY;$KG*~] ƫ[6Ė3ጌD bdvlf\Yei9*p|{8d8Z6Nć{XD 2FG<Q #,!IȄLpI˅6{l* )_Np3%PĘ1jRR ;[ba\r[zS^*g%A{]}^Nu=&Qb*GN]%@ۆ[FNç?淛$LpT0%!(hc] 800Ka*0S>Y<=VZm!40uǾ2 ڂR Ybk3$$^dIO"˹y}=ze.ݢB26xK}ұIzn2yz2hr3&r߇^&dA-g-|tm>Dhr6NJEŴl1btbFap^O&ޢPe Mg'@Dz׋;36Sx\h}x뾌ݨ7nq7ɍ$J TsI89dVh*6ʰ9bi[,L ϧNM. lz:9?8 9&aH&积G#}z1f}gpSb~KymϣIt\şAz:sXw.>[iƿl~Z{pLռ |G;tqAh:ӿO>8P|O۫/W޽}c{E?]W_ާ|w˟?^ٟoWWoܧkXpiWU2d~\H^_0ͭp2og_ϟf؂aLj8z>\^Y|4p\ ^|Ì|`OAXc!9\R>e|t_Ǎ[+|ݼg>oڱY,bdzьԚA-vh-ՇؒMWE$ӂ?-/xzA\\AS)^`g&cw{r ,K3X?'9ݍ0yzLW'@'/K_z7۞g_%|3;7e=f [p5]'r7|)yCu^oӬ!P+_ ܧK6mkhX`7T]->z0hgZu>\d̷&O#i>a\d6[PxfE jED/CK鹭m6]\(-}wB}ϺFw;Fċ&ϴPFs4=`JdiyE?u>KbyQΥ 7Z@3.N Nk.pSCD(1Imb8N_8F_WQ:wOu=?J{ͣ>|l P,ɰHUvfRHGD}旽8 Mi"{;U!␡.rYlf5p;ְ8p"/aD46OPv^ǍcGǻ%^k5|^8'h#/ #^zdG!7Pv賺eQJM`F–eG.oYDL\Ȗa!&#-2aV!. ENK鐿oUıٲ|BRy@UN"PMd#;$ BZ¨sQ@ i(@oôD:?C~?2tܽK{f +^Z 3n|DT%J={8ԗ2~}ѧ_hy|EA{GO>v m@#_Β1Lle^!t)>33%& _නyu[ $f˧{O$1/|~[^WAuQ\.efN k蜨X;rlxٔVFr *v+)[q鏢UqK:]q.ƨN]\}GU:+*ҹҹ  ~2LQ͛ާy}Y 1??d.Y3s3o&Dpq^_9>"nKӵ4 Ltz:L 2yBM$ :?@V$ xo䯇|r)Ӝq=`( DO.#]p!>f(|5Չ&*!3Q-U=%Bx=A%NbR8NVJQgIC)<>?>NrVyI[Q1O8(auD|k[*|!w/pa ӈf&F-hH1X#A9!TM$uau*|:0&Aď:EWXނ9`1-q0 &AXQq2""XaJ0X0P vLi|`HdU`346&W >eNcBHa*l4@HDZ ' 4c] > d;yx6M˵2w.&so;=;TyM??X޴w핫4w5;bwM> eǁǗv["+)B[XkEKP%BD+M14|۽S}s BsRٻO%waVTrQؖ`]ÚQX3sL\]۰iiy[T+NłJ+Ac*@фK # :TXX~ ,zkSnc1CMcaa?,bQh0a|غ1  8ny)JQ0kjV{m"~u $5Wj+W0[T{8ny)Әb Ŕ:AUWKAe\=n"c$є69~H} ?  
PY9VPNjرsƮ ) ,²:C}VZqצj r5~.K!yCGYhQSBD$W?A6<* +֮j( jsfc,5`+]xu!0_dĽ v0 4)ʸn >5}&붷2Ҝ Zw't#+b2F%}2䂵yv͹SVQb7c:d>_~Sg~Чel uz=K$k⾎DFr`K[u(XUi*J5.(zWQŌF-Q0> jyPُ(X~%N)ywNyǻOQ )hu};'a4Tu Epiu@!(HCN%h4g`ni6L}mhKo=}=x0Leݥnz{;n[yD{QVmHVK6\PZgGY,Q$\ %86:հvpp֨pDUL,Õ)%UkkÞSRꎬ 06>BbɽFC)lJv?k׃p($|wsniOwiv;MjuAꐀ$n[AT#% 6Qo!?5 E A* 1$=-Xr E&ߊSζ࿄?iq4GE=I2T).T枛\M6( dD{?J J@Gom%mo{|9܋ /Θ Fml͑/}ljVn2Hyod4GL[%z?T8KqJqJ#l>MűKd_<*y'Ñ‘;ZTBuGCNvBȳ%*IBQ:c 1Ps0lI=c:M!ͮ,3`; U !F9BhqgOF咪Vsϕc$j;&?7Bоh˄nPhB퇓J AmG&! %r<mGc-QZ٣eQ{S E.X@򭛾o~D{i0ڏvBR/*QGâ dh*dinU{r^ʬz+$[߃pϫ})Vr:rϲpp:U^1N#NI(M#E$f,f1a0e"X8fp"ajӣ:$!a2<*N5 i 4G@XՒVvGT1NSt :{w[{+L1I^Oty=1x?DaY6˃Og?>eaۿeOEҩKUĚ~Ig86}e`+q/>d..2NFVo{z}]ݜ7|iDy}7xm6, n|g7bgt)'|h7;79x6R_:{NfF㫐Z/˙ &Fuj *:9mA.mWOda] CR-0?_zҠ{>Q]taNӇ/yY}Ͱkf5?{۶Ar9 :u'-&0Q#K:=8F%ޔ|Q $g7;3;3tBFA%@T!SMY%,+kV9s_[Pz̀bZ/i"q`޻:6{a0):0FzJƿW{kPbK{|[Cˎ l2ّj)Rpkٙv毬]EXdu {O\WӃO?@gO݉ǂ)g^dB1h\6!HYL+5_tWl[-;K }=|HVX)!=_H߳\`(%cgyFٚc9G<2Hhhۃ&Ln(*>`hS&$ Ѻ('Q̸ev汇,rfbOxL>1ED#[b29dQ~Hdgu2^ޮ~wfL]|AsY2221xqUfdFFx;y>QvW[2~%GfyL~%d1Z ٻ8`WG͵y8{+0-?sJƿWŕ#DᇄPQtʑJr(\ .:>6:cGҺrC"ˏ\.rg%SXf&|m8fY,J|HbMlBQ]BCų-a"CUJ-YH 6[WWĀ8!4gompa<'+2x;&1U֗ O8tT)qܮkQ~j|Qmxy#R\;tV$+jrr˗?7j3F왝}^\M'] (/n~ըk`Ù@I A>X$Ċшke Q)p{Tl̸X]/q8UlկMns2ǟ]ܽ9,h"0Q  .aۗŸva72gL٦[d1DHK# EN8 1rF +$60M$tCmNeYl^KFncxu :(kΎLe~6CLS6CJkY$~AE#j \˘(UPJg%u q W[)8J-t4o._]G:5=fO+qd]Rf:UxL/|*;׵羸oS&JpJԄ\w8 "_0Aͻ=a{! /lF\_FC"Ua7$~ B-d+`>@RnTߋ%(ɪnA>f-^dLUaǠA̤'T={HE[f=皧%? 
]@:O=((CsZ֌vTsg:号FQ-ukxSrgUw`gM #ۅys"\ UqNݽ KBHqޓ$Lf03l?l=8Ofӂ[^$&au$QPz lZx̹UD*֙oY\ orcQS`"-ϒςǦTAYR)Ccl UOEqu@G~[c PqZT;dhu'NRSJ؎u%͞2Bl?iat9 rfR_V(U">$8C,8f w ]e2hg;ht\?ZpeG};(T^Ɋ oXzc nkJX>.Y,GykB66䴽ʕ|6&eiYz)=3ÄqZD}{gfp6:8Y{x|)HG*ZA" "fX3B"5 A'[1}Hs9aU0vt1BkE+Gxg<$a齺nBvHF)"-9g~[ЮBٕ/>L"FbZ\l $&'*{޲>rau?ƟHL[&q| ΔJ{6zw=GX'g糥|'yR/t>[&ϺgyBg(uF%òϷ\ÒTHR9Js-"e]LhؒF%Q4M1KC1lDwa>uD ڂN*儔0)H&0da`P"Ð"< q PQyc&:*v _?XЃMɂ‚ =,` yr*$2E۸bO ):0ۈYrH0QevNϳX)"hX"ā iE1m"ZND6UiDž UYsѬ^vMadO64 ѢCcKS%,rCL&pz0?ͤcEG9UcRvP4݄ gfX"A% $Im{-Au'^oT7a<2 1ol<; =`Xg8BzYm`5.):D78Qkuj潅[0rq!¦V.8 [_;7}ۋǼzk܇N] my]Sfھ1-o֒i/쌆iwRQV`<*j ϴ:Ss0xpa4jvfdwT1 5joZ8! #|_MPOko;.plk~ i>6^(/rI|f߾3.P|8'/1b$G̟L.j&L.<~?l~q'ܹ#7Gk'Ʉ/0{6%7molh7['0Wg.6+}_KbwFM$:8blcńN%:xβ~\w O뫫~{͋ףn}2ίחη/|~Wf'7ƟaBǫ}s0w&0),\?o{z|ѽ'-c55H}}L{31O.qn=4=Ce˩rwr3 G-{+ ɯ_ o,x&+~,,lCNr)䈚V?Δc3Kt"wQ0LJbm8;0gO3h ?g+(ڱUӵgq)peB/`V ؝0,B#Sx>s£Ep\t@l1瑉0tJy0[ۻo7f%ć`+F,zN~i5CjnqT5qO D&L_}8^7~7{0l~M&~һaEpoFɅ;7`52f`dN~juLrهfQ/{a7 314?󥟦 ޅvn'v_;'7@M>_._OkGfO28\FB\O'{9h nZ@o'ԆxuQ##C9FU%}NxDeL#=fIi|I€Phséfykm8 PrOD͡Qc1ɠTÓT\C1 ڜ#:XM$}=vļ/ 98'V)Xj@N@Y1M:JT\4.E Gc:݇ѹMK5b,:Wt  [`:W&2GqF%UL* u[ I)g{HcIOyBSIc(gRZ M[X b\G;ij-~_O}"G?.~dks ? valuF?"|DDi)#yܣ;l&Т|FHJg8ZeS- lRn& %EpXlQԈ@glչ.lk[  o<B,-36k 1黲D#ߝBBXHXa4+ƅ @-XVI?1Rn]yնyuFqΡ^.Un$5v1'n7?gJWO~$D !yx޳OoF{ƭK6gO N_aCQcK$M?,.#q4jA#rkADdd|~t6_Qn3rH *^>=x8VDIi<yof.Wˋ+TANyMAXd.dc(+ؾbC' Nyt\(uI.t21B*N*_owZ|y[rJһR ah֯ ޿|吷puDtsW5VSVγ:էQ8bZK*[dՎ$viUXbm8,{ZP z3)߫тT~y(sYG/V'nkZef <9r%M FuFwRVUZeg{8@ΣgmdžF(BO:*c*l_I(c\{Tc=p4ŵ9G*&4I{IkΰR0\QN- 7``aJ[{ˬDIƋs/'vkgaF#T{jc{ aHkjd^̒VTkn姷c.:oYUSbSbSbSȊM!e.x;US]=b+y ;3N@RqpB b9N7]fٚ𖙒R\'Q<. 
B Yߌ[nvn6g6.~ua/q9MWDu,Ĝaa]#:mjVLD@~]pjλhu; qB!˘.ALTRrž^цuxo-i6Xb`\&($S:Yc#9X¨lgJ aZRKK;ywxw~(3Hea~p$)|۽is_+bƼ^%oR/`ƻj-Iо$w@-D~}V\6k3.ⲸlCw8}Ob'.G bN// FE|8kP WeqM٫ίxPx1HfN(Tgf#LשtT{u^($QhLfX]zry׽ׅĒs߂3׌4&"fUc"Z7퉮1f )CAZu' |8 ,qEu2( ']Ոhtx]̂ ml2֣,t6D2\\LekPJsU\:,NsJNi)^#qcq`-PsKq[WԽJ0.䲇Mozd^hT PQ (`K{h%q9H:;g9|g:]LSXhAn{5G.$Ҥ=#'*hk $%; | 0,`]MsS(UI"飳9D9D9i^m-B0ץ!od^22HjwF#rDX6`!;>r[:Fx'298^kI1 9&FLEן p9 1>&r_n#CpRdUfa)@r 1j6y@P9$roo?LoY(<~99=`/|'ąKx\%MzQ\Ք2 U.!- JHz2S}(G^F}=^d232@J! */ZLh} =.\wS7Pdr j=ZKn1r dt9d8ITX2hT!!zTAf_>l "1+Q,<3?8 !* { BkyI;LGI\.~퇨~. r?TK`g Bǜ $Oo? T r"h[ݹ52F5*m}0*HҺٺ3ξ䵌IdGyMS2TZ`E;nU7)`k5HdL7Ի8`0^gv’aJo a,I Fу&-j|$jJXxl>[r"Hvb]B_Z"Im/Hp9l\D.ŭY'/!U27AF9N\ %_wj]-z%gryt%Qud4^i\ xz3t07}:/hP'/{ ZmH#@+˱0Ʋ0@ɭ0f^~M<4s&h˧La16//;vD2^8 rM3fSD1YNj&kE?U8p0+ Q<8tZk|:zLWHAx]d1e0]3s$[%[Tm4C>"n Q_37,πF u 6]}_~ϗt,8Z(8D3ojWFJ^5OdzEmz'Jt7kTe)%fإwNsŇHZh I(C(33Uk 9_lmTHy 1 mS*vs~Gd<O"ZFןQߙ'ZHUmpy[}{"k}7i(&P>l&jcbګsCH"oA> 9C ~ B["Nn j7^cF_Såȹ ,pDu"˭ijIj4'cd>Coz)pC NiH?.ZbK˔T" qXă"nj\B8X4g\CpޖLM 6s b`9E ],»(0F; Ǵs q7 ə LIqpBZr !S^X[|N0T͗Fof/n5~_N nSXKۧ+AVHDၿ* 2 V+!Xr- `T%D=jL _FB ڐċ3A)U]$)ݶP-n-3ñ|?sW'K']Ip(`.>eE~$@݇Ɨlt7%9kf^YvG NdTmhX[ˬa(A&^bm|I%c1AZh8<.Nلv~^c$pHtz%T&KvZ6^:$6fLYrbs[)r`$Բw{bO=E‡+PIMĒDϡ L SqSma(͐ɂ萅jI,lnz.52- /eWJEçï~fg!}Qza>EA׳* >/C)*,45^v>il;jh+f+!hڛ J*p5 O"ژAiɛ>a;ƒ_JɒھY-CZ!۶XQ6/c2짳Nڷ)tҸ]1g>ớN Exޓ)pp'y4r#s%I(>4Op&b8}63` J'3_O)< {c،Y 6Y<Ɛ1OBI,#N J &)nJK@lʐhӌxH4хNm)=vH"PRoR>Ytթ-;v6zPDZYzšwI}GX Ed( $\wD=g<~<A8t)Sv$SY!dw.)VބNEO7'*w0,*|sѐ\5F}E庣IQ* 1"5C[KFs3m@)p;MC–Ɗ~Հ`ܵ"XBFQ4N&UϧVcMLsՅ";i VQh@0ݙVߞ6:;h *lJP2SY4)`,^]RSi%uo㗨E |+gy^ r7ƍ q#mq#Ȫhѓ(1 Fa /I@Sa3ͥfQ[l ~7q݆ѠnEo t`נ܌/?BU摫]鍘U"x;xK1aCP`c]䶥*a5C3Q4%]P2ʔcٜJl<1vy@Z"\ lWA۝r=Wh/k7܏'>n{A[SrgL!Y|HԗexTlG1%(&4UZ>WJ玠=w#h/]hdQd(bCXH%iaT$7#Da!"M)$ ."dxT*p@uGb $,F0P/9seJq"Nw3_GFFIZgu+"#W;ai [-`q"B[N)œS+;6㒗žT nStVOxG: QK'wÛ-8 z&`Y1i\RLTS KF]O柵X4z ~r)"Y!A ؎aΓ pct.ua:@c0($:? 
AYu3ܞ/%yB9SZ|ht @TyS ;'xs./x3il|5dF "\> 5}%)M1HIf#(^9\֕[GWrX]ְ4IJx.O ^ku[v$1Ktq 5]`T:Νx; :eӲb?^4%WMrmq)ƂHr_y=0O8+H8;GI< g/7+ey*0e [˶"3_-r9E8888 bۂf% Z'=L <"1|Oa~"(\2{\˫n\˫ESfdTUr Dך 3J<ݻs4 ?*Ra嬦ƹv\{Hāx|~4{lN*q"< o<{OkQ'[ێ|-2@K&Qs\͊cփZUEk}+Dw`鯑XSA,cF`L`p(2 oǨi,XhUDJP͸ԸOT TWA;W$yN-' v+JiI{82Pw(kCZ0I߃m/*:{pXY4VEg*$E2;- >V31S3ﰣ4WS˸ب[ssޤ7&)A6 IΔޓ'd**2gd"S΍7C?̍^zDa{H"=m`c+4u ~/ TBY#|+ࠑQVEE~rqfҠ-0n7&՜^q$*N. c*R1!`Ӄ.>#5Dd>s Řv.bt}ӦЌ|.u,']o§]R(g~Mcq.3*% 7\/z8 $?O?;q 13zݹ3N/;;n*d۞,yy$Ԡ3CeVlZe:YG'i]i(ťI| ~>t1ޝ3iE"h;rxn_tۊZ>.O9 J}Cp2X5vs/}Pؽ4{QlA;q{iFFU Jfu#ZrDvo[rΡ\F: 3fT",5o %4BXp^-㊒8^]'ZTnS6HI䇸UT+? 1- 5Y0CGV0I %Uk Q2=12tt3Ҕ}v v ;G*̘>{?!S=rRr F|q(QŅS hpNq3wuȼӧ)N{a KnUtVwgAtWwdw]]*uyRdSKa}qa<e˭2ImfDr< ӂ ْm渍3&nLE[2y八(dԿ.h+)ղ;8)jAFJy @g q6֫׃/^mABI-deIO:vPܜZǺ;nZ#_6RN%T&UGzq2r;gE~Ȕn`%?2K<]ѵ@W0?Švyd=1UJ}Ae8_o'+0hI=OQ(\5#A=[HPYE Q>HY(L{ ~ hvo.2Y]+:&ZU %*r蟮͕ͦ8X0o7n\8sʉM/7)Iӻ[|kyxB|(E iWʋvD.-Ɨ3iC\^a0&yv޸QUC>~n}.l9bNmgҭgk5ՁmKϞ_^&N^~g+гx={']p]Uw3v#IXó4§̞?ĶQ&蘹x.濘õLa_n4o/RMv]'//ξ{i;5/_<xWO_og.>v~{xwO :U}~~JF٥r[z-peB/`ހ+9ZaX۠39JuQx{ ={].xzg^zJF%eu| ZF|h:oζ-zN~Oѯ} &nzGc N f`zL? a/?Лѕ 3\>on?gzip_Fم_7Q) Lr~p ,ٜrZ3eyAФo$O t$|0x:׾"}~V7?>߯p md84}V^@߇7ԆtJu}iG[q,.*yf4Q\,`w>#S&=60@A}2Un3ݶF-3$˚E3rm̎apY+=m;mE|2+ o UrSd<$Q0۵pX2A$E^ /C=`J יr6"^jɨG9g;t:-Uȣ.kˀ,H;H>[L+8|P-RU~I&јx ڦXu'_T0`Kvs=WG Xg8!Jrr5.iwi3JW^+q"aYG0ucQAwu@@EP9[ᰱ(h8tM8Hqϔ2lP"Wr~$1Hp :8SVp\aK[O锤 I*XLK<9gnl(5 /;TS΀#dW'#JÌF8;YL&ݴ, :b<ñޟEz_ !o* VP+0Bɦ=Z,JRژͧ9p6ꎱ-m( mNǢ\M)tSRI*3CP}npV k-$\!e)5}DF3hG5' 6hܸ5 &ߞ6yez&cZcX5LrBwJ"qk+j6iPH\XO>M$CD!z.43~Ӵ&-h&'m_[ML"x|n)8aE{ڒI;*Nshzru7ww;0cML$"PABCÀY$BŊqRFc/~=%V++q/۲w.ȲO?=kwݓ䫯3~y2t>4RҏB"B#.!B)IDmpC2tukngd[{I{ݺm?hB_ !aN$ﻋ=_6HuT9"bВaX4y]=]Y;fJm}78V#гp#>F#) QSBNqM%F`x zePL&Q'8Էd-i+-A(V`ˈ2"x! afPd z{B zt֋/=0m qC)GYN`<;eUdFJE$!ހFbY@+A4n?={ qL5U_ n|dk 8FS.^5AU{ՂnJCT !Bɜ(9zC^幑CX0#&!m`\Cn¸Ɣ Q0(dHO]*!ƍԪ$b C#))C x J`h!XpBHM~yY%b=2{ SԺD>:J* CUmoYУ 'C I,N)F.pwgJ-gG$:%XRgcI$fgװQaͮy~;`< _@5%LfXĈl<Ke8q k<4x<*pb(|oǿnO ۨJl`<,lbnz-iK{`oǮWѶΫ4g %diݢ*H! 
GbȄ 3혊9v0+\!:TctRѪ9r4;#-nMg׎D:K)](E`Gd=~fi@_%TΜsr.KCvMʬSk p[G^C|/h9qN"zWIqN'C-%[HFb@R" 3L,TS7t/7,¼ůox_j '-.^d냪tA2פ~M7\i1‚p5{: }Tǜ$,`ieYO,ݿdP L=i$&KWGKdp1,F$X&yVK]"#G)Q'i*1#>1IHazTe0Gsq.#w8p9xjO1Jy.KهQ؈T1Q0ݲ6X`E*w AGNu<}<]$v"©!xc54'=͔EJ < )e6Sݎj;|>Xbe|em;*j#2] 63: N{nEsQY 0鲦gP/ JUa;N{+BLVPQ_SM-$ YR"VUYDzFЀgZ ͊3TVj:<'ud s]ؖ.ՌO5J0]/W܆_m7g"|;*PYqCA[}+& ҭ>bKj ä-kh5nlOʹ}D 5^v5W*e}>C<xO|G U?Rv]#I5ߏLv.v=|wqsu͎UE7[PA?ݹxNTը&?g]9<~D3p.x_tm[ *.**"ۈci \#, .PQI1`aCKxA%m'$6G;`u0">M? MWU8]jэ\j8m]B . ׇtC|j5 -!'ӳ.*&pݩm5VDDd""F( pէdjc`6UhT ST(Ǣ_*>3gPڶ󮵥A׎ϱ؞Lr`9QӲ՟Nc?M KͰ*pդ[>t%#_~FR*vq+x7_,`0K#k$A~+DʫatxNa{7}0Dק/1́_ c%yzq? k/Χ=K$E_=o$)I\Eŀa\xNy!N!s_&'n| *W)jCMM:y7 %5bc:uHJ0̻/ݙݚnl{ލa܋ح.9S*팹Tt;nMX 7Z6uNnwLbe:5θFGyڵs[Mdkvc`Ios<9=:ET}ĢR"PSe+*ޢs"fY/'O8Jtô ~6B |aJ/$Y݊a@nܚ|O?71.Lg_`X!q얛Cd9 p\ZϡꢗpO#*3s`a܃w) ξD9+VM˷Sy&<(pj]C <50,nUDBv* o pJI͘6$}w ,cm*ј*]n8xg2 yhmC Ґ{bG1ANs$ +%^s{SEEVLk\0= R֤" Ҵ&-hY8ݝZ>aSͯ9>󎊛u}4t?W 5?^`T_|JRV˗ }dЕ5\ hiKU1vXk^Qj)7GTJuR_ s$})ڮP5S{1&T5@ o;__Wth ۯ2ST9"H)vb-Pɂs, nRCXq70_G|1779tY· /@;&ؚP(^qM(!BA+TIyRF8 di 3wL>_I~> DGPs&"ٓ~YĴB'/d᥃"sqI* Թ`Wl\DVѢXXRtFTŌʜ嗕Y>ݹhL oYw9aBNm1!WPJ~$&.vI uLE_5h-PAの IγX**GoPϢeV ,\rIM:a4 eT(cJ$V,DւL!APinK^5O҇~7^~Z/To6UjD*! z5̗+"7$tvzTM"[BZ*|w擙7ߤ}o}0T@Q5CcLY͇ ;PIrkZdn>aYș :n%Y7Œ ")z,xn!3 B Fݲ}47Xq t+DBRǼpgDF T8SXVat:LM5Tm9} ItCrEF8ٿ]%fIƦ"d3Og%&Oӹ߀҄$9g~h ܳ#tљ W\9Omٸ q8kt5V},%"Ś%| 㰟_ŏÉM-*<X`-ysawŔP@tڙ&z&$h-ܿ0{KҼ!j8$#!Xk?-yN3%9}CogJPϣy aGXڡ\bzVhT੆dE· EL!eonl9!m4$]9+UU%\Ri M98E)3W,_ԻEdX',Pf] K^|$p̔I'װ.6^2(pKQEMD~<ɏɁ>.hѱZS>p]ۗ` KbmnqB} sX4?x1X jN $Gxrlz?@Zc6q@ }]k`@PWg9}\9jWZv7 Y9 yVNl:=;)ை;p^Usx+ޅ?FmDvҩl[kEx,c꠽枭0#Jߌ)@^Iо|GԉdBSк9Lxqtvw-8Kv$O͞ѳU rF (5^0ۄ,-,v+j bգUaLR%J/_ k.N3:ke_; *gD|tn]޷jkkǿe\11CGt+*&Hv'8ɚtU$$c\7, ՀcU*XwrǼr VBuެ#s=Ezo&c[7T2ਔ9̬kMDlR)Og^/dsl1q/q0iYude{g`+:>7>0Ƈaޝq0O=&8C?GI{]xo&୏ 5wqPE`Is0Bm6]AYsұSv:mX 0>q֥߿*%[ ɒ́C'SKEg]a?\ @qD!+5s! 
2.c7kE8f-UhQ2 4ņGNDxT+<9q[jǸZ :Z兿YF6_ cm0K7{'0 P`<0C|$k؉tRR#;J,BY)4%q0S@NpA"lI&*-v z /8_m$5Z(Q) 9KD> Q!AlH,ђ#ц{"w3N&<sYHzom=fmڄL^rw[z.DQR@0!-A%Zl`r6j,M  py(p by=73EH?3> GfOqaG3jL C/^}1W}j1tYǫO\hf[`IDpė=fI-6y5l#&ԁˣm#] >5)\DNKA@J1' ؗO e谢=>K Gh)JhH&]s^hs]zS`sl۝ 立 ࿋1ya{x?O>~x|¹|9$1Џi|K z`jQܕFpQźFl{ZU@ܤ%V@@>>bBFb%Xw`tiI( ADRò#.d;y B  RE3,߯/w3_$#d25#e!WBin-oߞg*vK޾ɴx<食a,w/g@x|'zF0Ɠx?6}G^ݻ{gLw _N@ŒPeIr)9SXS..z?M+ئabHdkLܧwك~wӠ~l B].EgTjyѕ.Ebo;֎/9,mq8 ;8uxZS;ӔӁ1ˤ]&c|pPs>JVhӉv}D]ۅ"QA4&b.x>PT Fb`zp\]&ϋcH6X,R&UJs=m eg{~PI)̤ME-SxE3zE da`QFu#`c& '+2ЊԂ*׿/bIAhtA]aJQC!'S5r_4e@R% W!M&Օn3JO =-)'"j.n绵guH䘣n-J[˨ *nR3|A0(Y߇6#JiJ*@,aGhp@ۨ[$hB;N |Nӓʠ6ʟ&{_3NO)wRp8u18!*` ow{n0fDܴZTk/כ7-=pb@qv:Dq^ߙaN63 WQzgi9 tYhɈo @)aQLMN!Lzf< @8kIpNz#MH)(mٝ.`TDdrDkN4j/'\!1 f\8E>dn98#0NKps8Oa~Hӑ@C(9NjY{o{+`D}a~0.Y&@Et?6$涘܎ FeIO L))4 A@VF^&g-əacE׶7-D=ɕx)D'S{wOu^˵N {-'^ 6į;<I'3G7Ч v}r_3ۡ'u+lN1@cl`P Dӎg`b]}6*A贈Y %U۝˾l-d`k&ر{IڣF?k@B1l)t@=s "bhrd)6Z"a1ǂ)0"d\*($p>Sf++Ղ+V-W9́kWVcȇ_˰xbB^DҿP Y+Y$/`^ I^Wt] EI˽QSQ`(VPSEXX`= {Jav$]?zBp/݇,1us4.eo괋Y/{o7[Kov W6b&,MǓ]G;ac#Dc5jcfSVPA{;`gl9MwὙ|>6Ί)ݵmG+K.7аM(ϑhY"kk"W2r{f2v9J]rrUoٔk`SѰM%Hj@=$;DJ4WNK)s#j*:p}-i5_d6?=sq崭n@0\6Va&ax _crPo`DDC/5h\pIBc(y}^KC&ťN*m%!3pmK/ ly4$HOf^:0x5-$w/)rRK*JNۢlZT()zRbGI9qf @B@zbNX |=G# /2:H$ yl^U& /kx ;&+?N0̯GׁKyMI]:/4i9E!yjOΥ p+ ΐ,T96M3e%> DgjˆZ=MTK(:cX(1P[9[y8-acj'4FaeԒqd,8 xN,D#˗V40RS^A'g/T J|ǎbP2bJ_?? 
gAL!Ι9n2yw5y2\qwgwgGS~6 eWs61Ʊ>,x"wqBocv_a:J?V OFrZn@ߚrȏ1m*R6!Mp t|m).[F夃zp Pb.nXȩ4O &ɨY)P5y37ˡ07˱<byƾJ[L4ysǎ$ĖjF 儰 d1&v1vic * [,XAOL ($XeXm@°҄LA2e#T2)7+ ;k_vN!8|[-}vY1=B=5F֥wC͈WZdD` Q,i::0Lx /@ffR BcLl`r`kA)͙654 m(0t ĥ[mFl2~,Kf,nć#1'/\OZ`o01v P:#@IM”/Қoy Cl*&/i"yI@]0 }z4!}7=٢y]oѕwZ~OLBb^C@~aoGolukE{ᑥ@.?%wI֭YktI}Ըgm&b@Cow7u\MFC58 Ti 6;F ĒJh9 w5 9 GVa ?.W)=bϦWɨl'~uQ\W Lwu*TR *upܳBs?e0Tlez( PNC9} Wƍ=Qu:76 Ɏm Πk1qH5 Rpk/]L;;4bk={&A@t|_wHD6d&yLtg~f GTQt*k N(ah pul  i) 6&Δw pk6DxLULd ^`GSM 4XǓqj Cb\ R~Q'1S䩄%͞hjb&$*un^ EtrǨNh mwv;Pݚo\Dsd s\IyII$1/n)]0Pu>ỹSE䇅V^Ԁ@.%?pFj5T}±=FzRb?^҃cD;[8Xْ" NÖ|"ŪؔobӚo;bI/w鬈Ad xacuJo'Qz$ap]0xX &5z߳ zk(Ʀ<)K&cyPJEyCK*i{KB[m!Ҝ|?5c֑L;r#|eF@,l ⋐'qCHZ1hMƇ⛏bWaNlW_u%L.TeVRf#!K)lq!i%r9V&  ?Xz> #K$]C)v=limPms>ycɇח'mu*-qojf!Rނ\s_OGCV?MgϪ?)/2|e&Mᛲ /{͸RZ `UrG(5ύ9K %R2 # -`3殹 rz;Xf*sE,;Ywh?zq09;8P9oY}) vGdߧ!5/[}<3k034P!i8X塀G%V @ôH)j˥&F-D"f|̬h4 bZi7͠Sp)%/e{BH}?'NmQ3vl߿dLӚ^\\`N.{;'K?[PhЎF"T( `%g:E/g Jw,P(s-ziw\ԕxhذ01a6*8~ l˓v". _]ލ4#پ(M%h bob7?q8/:f rV?[PIO59A 84r0p@`J IJB!W^ Hb4㶛J =VlR1|秛xh&*^W.r&8u,|]rLyh2g٧{pId0_>>~x`:?]ҏ@E/ 3o~8Ζ"绫), x;ξ_FLxϏ}$pmQXXXa o"!A#ţɝpiqDEFBK_:D5 [ͩMuUzNU)0,7Vhc o2Hžyb S&Q"aa6uiaM89rٿ}|~sj4Z'V")QS kZ2e񯑽RXF e)2 `/SZy+[(BqYY~1Y AZ&t8 eJ) 9cQm|* JAcTe3^RCEQ0 'cuhZUĖ]?s~$>[]Ș+0h|Hu-)N8Sbg;F-MV,ϟ7E4y~lE=o .5?~{~`%f\6C @IW m.y .]H[?.S=F.aekNdK[0w Xp[um}+/ IoT,h֑5(HxˍAB9ka2Eb.?ͧd6UW*Fr'Tx Ás{e:q-wrl4@jU @RW=&ἠn^;aF:qD4ʱQCh*Z°sˍ[ƴD#A0od ̏ܖ؋uXbǒ0@ޕq$B˜vއ1؁˞yBȳMn1IEH&YI7 R7_d1 XPcZAmsX+\@p?bi7$үhB Ү^qE0q¤a,1:KԂ< 5̓!C2Ru*D`X&zTiEӅXb,Lqq1l2R,>Fu PdԷaKH ^v]Q+2Ѧ_YwAUm?TJSV5Y b;:f>_>|6썷KWC1tvr'[>lM@Vۗ)ԁ VQ3Q *ye6W*v2uBNREfr l2)qմUON}ksp$J`X\ jvyT>JkABb\\J)+`щ ǁ4,o&e [(){'t78` 0{[3PvUsepCʽ^hZRjbVB\.©'ͨw@ ` e*.h9**)BM. 
_Q)h1xY?h8k]y);&/j}&5.܋6\sZʯo~~Llx:IMәq B7W-b` W=7=6>)!?jt >[) Pw m-E``c K *+b*+hX X6-.^CC^WY},y_(q.ָDg!_3yjIRN o:n%e SUܴIs4k!njY⎖HD8W 1i!fZ,t {1l_9{1n4~5Ju^CrY䖻J5mqbsJ#TAfTjKMZ;]\*uϗIMhjզepB!Amis5UQ.4Ik8]B#S2R jUv2*UU?[IY2e?`iD ㋳J"UepB3Y,(Ev.fD#ݶީԘ$Z&Z%ż6 x;򞮆ìgy I5y6[ F,-L 0҆H}ʪļA BX{ itf Ym-h !}W+H#ӴCWL^[ 9v+yvދqd]]ڞx&N]jCaھ}fx)g|&&r˟D(H}f L"|]9J+}Atjrݴ/o KQ^13ExpE JiqKxݖz4DdP&x[82aEGy\{͆KDzW#c) I3fޚRFs-w4Ł`qD{=XL-KEkҷl2V-}ЩvM&(V(mm.4"ʁ4#R絊K邅=r5z땖4%[jN^<һϛȸKAclP%m* 3W\*Ci"Ā -Zj́*}_A \=}.Mjq.Nb%ͩYLjc>L`+A)%$/U & a^hp%!pT*dZhgL;lK G!mBo1=֜6ߣHXR`SKtgڡHHњ"YR$.+I.eq,EnVmlrm~VL+& ewQ\`Yyb˩\ĮPkNc+SSYp7zW r=|ZZv=]n02㛿oiSx(Q̸trι< %V+ĭ#TVF=}9Ť՜9$(nCM[YVt &c;(CyeqɾY:]Y瑃1~ .3WjΞڳ!jkI&U$ IseBa^%l7Ovo7Hhl?$!^OU ⸔>~y} ,(]#3o û_{OGFyÉ9nާ ,,@J|AbS?Iel. {c!t9O>|p|P>ތV A_hn:l[ f"m,XhӮ$2Fǯ'B9fkBxsPAC*`3cr;i9Ӟ[u NBg[xJ弣|?Da뽗3wTZ"{{G{bšQB+配;;1Y禅dtk;5,-wv򯻨*񲪺7CYo }18I#_PoƷYr;  ȉ&?)秶SՏvc n(Ŗ2+!ظ}:,#ig`!^W+^@9 cWUtcL}Da4k27-*ڃ9JG ^Yx)"K!D GO,Aӿӗ^[B1ho7͏O`HŚPJHV>|}C"rDޯph#B~LCû_߇IόS45(aҏ}7KO@pR9%}!sPJ{gtNzJTi8u\*uLjY-ǞgQőZ$6 UY0V嬂 LM՗*=ʹaNəӰ+JajT*0!lKݾWRKɍta4)*v+#w&(ɭH{.1֣[Oi+J1i.T CSp'Kc}Mu5pEDz=uç!j}=WiTfX)?XR58`P!y!Zh͜aјSڱdKJԡIT/֠TmIcTmS+Rr^*z~" hC:yvQKX3((u JѮ3Ʒ)qwmPBB4CeZ[XF1ܪJe.漮 .mplo?m;/twɉt҂J l܍@(u)4x==qʭ,q'gJ~qIte,!UlfoldegIaLzEZ0ʆ]sqfK}:xx@|lqyuKvᇠK~Z%qhrC- GZ͂1AG^U$bQN Di::)^+nBXV*AbL(Yp 1F\s\&hmɟ3-A} e^7X\c %mk^hK`Uv̯nML5]q.ݼܠ _x_zDm>'t5i Ǐ1fro{<1qYB}X]* zXjZ +\@~)IH;fSP=YqFXR ~<")ko#(@ڔ324a3NxW Re)u$ `eHcqVnӍwm͍ V!qʎ]^/>*\%%$%IIC\8Wv"8|ht7n0.)nbJ*V[Pɽqa^*:|r>_^?Tn^6]"|az>"LAYS Z]3^8|?f4jǷ0yO=LeQk.|w^| lh٣)PP`"U-R2-1b"'/:m,T{M,wmin>fUZ2uvVKUHR<ѵh*U͗qQfʎ吐ސbgb[Ĉ4X5qF29|L%hhfa;Q-4?ֶUQ8mP}+łp:τYN/_dy$8dj#%i9!̼\,\"zT|Sz2yN]|ݳB6@s|x8^ ܹH;p"\ w`eR> jJ 00Fq5c# !(!Bku !c zݻ}6"QHp2EF]q?d".H&l"6Q1KFZ0'5jIfvnDfo~Gs@6ax>\BO/%@F2u|,rU_3'x,ZŽC 'AZ0,]K9,N4[ujcG $Peb{RsnHdtWu65t2YVɢ|T=xI(QzdɉqpDļM|kAK x3DΩ+1"ZC6Jy*Y:Bb8.$dPM )Gw4$2@pLG{fTIHLܽ>YS(R\";=`[pDЎ0fl`]nߦR`R| ߘǵ ^k&ѷKWKc6ND8%c S viͭ;OlƮ3*Y* XmVWe],yW%F!jft ,ZEܵ=u|m~nRN4S^婖ְ{{كFRDŽdɐۈ n.̥ #!&Lr_n, RO6b&XDRr=xזz 
{dQ3h6;J(ZɒE\K"Bpoʵ.$ǪTkV57Eʈ bD!ɘQh! C'(U 6f)e!{e T2`M&'taS$2=>77k{y¨_J|]_kz_Cǚ$F)Hmgb7}\BA7. EO%̵gJi2LKhHC^)i݃5E:1[M T;*Y| Rq֭[݁f[4䕫:)!kg%.ndfDR`<WSnp<A X( c`BK ?R G5L嗮·gf|6gW, gw|VbV(mm.4"m^2^lz3 b4Y2@T({(i5)`#J$(k% :9ؕ暙\5 HDDofr\32]r(&}2͔tL/$g"eG鶎[n #(#PE#o-0RiY=x KmЬ+@6= g՘ X&XNT52,(PU. c/JCw9 IܿZ`zb(zI%CѤVW Z OAB;9 f, *)<ϐp6!B#6jZBMr46L/S|&f2.hk·yWҫTfpJ3ߺ/cx<(_Dߴ 'Yy)sh4/~yDĹ ?O>λ6|MWܟӶuH$ ax2](7ĴC p\%v7A7B`\Ff2)<h^{(A>³6=HGȆ灵cX:nz&TQU e#H8 |?9D .z{ΣkNGt R)p) jkٶH"+OG\?%X[5Q-RPqIKw3`m# cH%QH{Ae%G~ LDpSQeeU^o gJh,ARZzxK}MXH"n] rHF뾣 ^Ihm L&4䕫h%~ǾFk f1YkbE\XH2-im@YАWcrE b^ 9Q@6C96\~K7s^S $$.-QWd!͡i<)(u<_~Жw D05kxW?!y_ڲI q,*xbT#.Zr{sېISNs'Hiۛ\I.Z9?!mȻM\A3&Gu{M _˸4=,H20omu1BD_h$_"+P y0!ӕ2PTnħZijIT_Q*m>Uk98!J F 0lj!Lvy1 eԣj|MM֢BMeBmV{4FpJ:f}§y l0;SwO/ͨFa$[A}G@}k$ƿ>> @A$JoKYgP|zw/ٜ֝ ss7t QDw=<\ɖunA;GC|܍$2%ޡT0N@Gn#*` #m,nGR|=N4WezXg}\VW-/+ 6Cx9(A׼.Lt_.Gk.%ҔPb4s'h;';_ѝLl'{bZ~ֵJS'tD."etTBFJ2je$C\FÐzY2 `x~oӇ,˚}O9d>؜YZZd[mR)3FD2a5Lz,2Jd š1wEej+l{db;au|$MkZZB_˼'Hi+SC.j3*zQJB*AH툜w,ZeMN~'0Z 뉯^+$j -5XDc#"5wئDoKG5"r#[wkC³g39Ex uUAy)9pc#xqi%r;5c64tR~#_khe3'cr\ ܊=^bMO /+ame<4UBQim#3P I-ŽPnWU"˙G)碚-. G0 1 ѤƚED]-,ʘ<вEï$#Ɵ7k j(8AC , ,t!]  )"*G!OC.nQ+ZZ:RZR-A|z\`,\)w)5xWH%\h,!0O{1e嗝3|cYNY(Tj=yi' o[hxs$f!O,ܗڲ$!i۲sѝ3p;G3L0\?xDmn =ܠE5=r竳t[]aDU߽y͏gnh|zz?WA\e_{@ P} :=i/DCV1SS'q~ULnMNkp[M9 2P'+1+R0I.oA8HT0MxLq+^=~|[K3N@3Zʧ)IRFi L fȿ#Z fX) p\qL5"KKJ"% 3}s!)RlJGTxb| i,/ָһivʷ cs`] RFǜі3औ 8zO !8+s@JXD2l\|!J3V&Rr(,c;NHH2m'ئ⁼zzCƸJǜ;;beP9mJY2y?0~N ~<`D`0-ea"EL{0wG3 \ep;j}xCKI?2[%P,l4-lT=+}T_aHvf G  [cOjmǎlɀ| }=ЏHkwc7vApޫޏYxK@?+n,Zԙn_O`+_1jMϫm/p$auIU[%ܚ oP*z bi,dCAJs3p> IR |س-J{ȣ좭L1fM ![sLU`qpײzAu-]]O*Q9p,T+gIZvE9`=EVԁkL<6)AO P]"j4`M K$!3)R)Tt,-EVDHyxFLRerbgQb2j.(Et3-Ι@$"rZ qO`mkl:uzXcd[zv]9Ti׫$2BIZxRHk w. 
-1 y@I-X恏2Z.Te V 7A+0i`Ga la7*A"ρ/ '^,NYGEI,: $ AZ, (,RNEq;e+@0b@(MB%%W d( K΂,GZ(M'K`W#zw{n+.<聿0zat^?ₕd1a"_WߟULGLՏkoaBD% oǗg:M'ܕ|t;X?D6ڀ`H<"q1qLZ]K=T W{ZK"Zw3hP:w)13?5Ԙd s*k 3DHy[~$ JoviYABFo??NE0 2 nz,j ^IkOWSZT w{a]%<-qڷ$ ΋h3Dtn'r83ocR$>pD ^:ôS 49Y6)M!1.VT2c&EFS9gOWƠ h17 a 逍j>!hu]g8,1̀<#Ǡ؏V}D㏠ HBƣ)oˆl]ZMD%|0Kan*؜mpslTL]oWܬӱØ-`Oft#8JDa.ذ$hH36~aDZaʹt36Lk& 0YqC9| 2ϑR-ߒ>8vkکVuW؊ϻ."JC@M):񠵤EpM4,J#g 2LxN]0)]~$ (O3 -?`fY Řѩtn`UT2Pgt[퀠[o>V;~P9Oa ވ;0j)3ATux-! Y`r*t3WtiÇA%Ӂ*{A%Ӂ*{)f@8_Khct*4Pbdb#zm5 % Muµˇ[9 s} tOg>#Oy4܀M'*|>G!(F Vq8y5}~Ҩ6Q ~EL+CqFV,q _8K8@n:q>4~υNY-To2pOp@AygSh /שf8=/oxT@Y΋Ht3k dBr?Ɠtkbٷn|7PgW~j~0*5cNyЋp?p:͹r +Kʮ|Rs6^X0.l7|sRUJ&,k~\X"\cIn"6Tc]isF+,}ț͊A+˩TeٛJ%."H(ZyRbϹýa:cэB˾Jg-ؒNɕYqX[8[|TNbUkqy;?G=O < PB\j'ANA1pX+G l_PfREWsAKR]FKjEhUrc-2᳤̕*gA%uzq?'뜶3z;mtſZwUӶฦQfs 4@ f7S;ytw迂\%<%Js,gj7O{d2ip ~Jr=:¨CV`vte*27_'BvyzI/mб7\-U4= ma t%98 hH f-,ْJTu+0Q0s@[zs?B|"~{́Gm߇9rze.v  1aMg!?2LHB$1J0^@ !͖2?G{M*#(>X}c0 =O;q( <|m|0/G|Hsԡ`et;5Oe.' ,K(Wδ; l \zc4# |w55F=ᕬ_HB٣6{^bp}#r`o^>㬑 g0j25_RZbsR=DZ)x"uRyzxՏ9x{w1n \ޣf!xDvU7EvYP@Ix#fgUrmH̝(yq'DE8*Q2m8T|MM!ՖTgRqA͍g ֊RS%CBp %$*[: 8$ %㼚lO7}EFˀ YʃA ' UJ3WfsG9Bl4S MHymyhkED҈4 \/FO%t3KgaI˹Sk.9lt\߫ܫӍk42 lEkRNF-kF7[ϐ^k9c뀇I{9 ֑k~7-$&u^d<?~0M Jk[^h(yZK}ۀuKwvZWW`Y+|[B j6O <Ӹ}yo:_6TK&i靝\Ӣ**ue1%YEyweyZ;|˜UH7.dJ&Xnf P }GíhNOUSU!!߸L]}W2TI r)B)vR#\٥󦘪@Bq-%Sd+MLj(vGin>9[2%LVs [8' \%cHsw-yVB!](BL9Vxi{d6)*xC*ׯ K*wOZn 8]Ƅ 8 ӋL@gLLz W"CAw4΂ÁOFV.I X27f'1ɻ`yq|ٷޗ&ePvŤ$|T//ۑb{0^8pD#ASC¸Rv,n޵,{vݏʹ#8)Nd+zDXx(K>lBm5X0ux_zYɞ͉,Ska7}BcqLCY<ұ!T~vEkHQXcR|WO06pg9wi8[@!2>sPlfJ靓:rPJdTf]TKil,mUa3J>UڝQ#Ѡ_9$Xea4j3/mʩ 0Rl5]{{= "xsHo- 59p|\?v S[dʅT dM |C霒&".x"hU|Яmyqslx3B϶ 'KUo 'n߾As-֊/yCm1[ /FneRզ)^Lu+Bk~zyl uAnqma34Sskp.4;fqޭU@Ub5XĒLjP}B̐kxВ^,FšeWrX_|: / hs"&;"9I%Z"ۊ: Im ak0^-6fQ:LA鼜+NX4gi!)2ǜpq0'W.{^?0z4MkEߞvHKyewcsʩ|>")ѻiӍwi3u_~<^u~p\QS3 .hoxdF!9lZcҧxUlCAX $ϳP5Ğ|Kz ϟ9, <ϸc"/>eiƱ|y";ge S-J;x2ZG`rc0)swwk{n]O*7KΝXb~ļ/Qx&t_o@{vKpݛ&{OJo]K\ ?Gۙ'  ك,ou8sw=}8]?f|uk'g ?~o;|9VSp64m|6'{oe}XNϮ w=:-ywٿ]o} x{hj6@wn.&s$7}r4zn:{uހA_BJf}_k ɔ E~5,{L.igxW:N6ksya3qf 
Feb 16 21:37:58 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 16 21:37:58 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 16 21:37:58 crc restorecon[4690]:
/var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 16 21:37:58 crc restorecon[4690]:
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 16 21:37:58 crc 
restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 16 21:37:58 crc 
restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 16 21:37:58 
crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 16 21:37:58 crc 
restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 16 21:37:58 crc 
restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c1016 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 16 21:37:58 crc 
restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 
21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc 
restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc 
restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 16 21:37:58 crc 
restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 16 21:37:58 crc 
restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 
21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc 
restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc 
restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:58 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 
crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc 
restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 16 21:37:59 crc
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc 
restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 16 21:37:59 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 16 21:37:59 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 16 21:37:59 crc kubenswrapper[4777]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 16 21:37:59 crc kubenswrapper[4777]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 16 21:37:59 crc kubenswrapper[4777]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 16 21:37:59 crc kubenswrapper[4777]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 16 21:37:59 crc kubenswrapper[4777]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 16 21:37:59 crc kubenswrapper[4777]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.910847 4777 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919456 4777 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919499 4777 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919513 4777 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919525 4777 feature_gate.go:330] unrecognized feature gate: Example Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919536 4777 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919547 4777 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919558 4777 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919569 4777 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919581 4777 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919592 4777 feature_gate.go:330] unrecognized 
feature gate: AutomatedEtcdBackup Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919604 4777 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919615 4777 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919625 4777 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919638 4777 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919649 4777 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919661 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919671 4777 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919682 4777 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919692 4777 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919704 4777 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919758 4777 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919770 4777 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919783 4777 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919821 4777 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919837 4777 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919853 4777 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919867 4777 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919879 4777 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919924 4777 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919939 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919951 4777 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919962 4777 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919972 4777 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919983 4777 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.919994 4777 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920004 4777 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920015 4777 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920025 4777 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 16 
21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920036 4777 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920047 4777 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920058 4777 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920070 4777 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920080 4777 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920091 4777 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920107 4777 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920121 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920134 4777 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920146 4777 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920158 4777 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920171 4777 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920186 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920197 4777 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920208 4777 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920219 4777 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920229 4777 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920241 4777 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920252 4777 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920263 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920275 4777 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 
21:37:59.920287 4777 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920298 4777 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920308 4777 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920319 4777 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920330 4777 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920344 4777 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920354 4777 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920370 4777 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920383 4777 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920395 4777 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920406 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.920417 4777 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922099 4777 flags.go:64] FLAG: --address="0.0.0.0" Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922135 4777 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922161 4777 flags.go:64] FLAG: --anonymous-auth="true" Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922178 4777 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922194 4777 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922208 4777 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922226 4777 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922241 4777 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922254 4777 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922267 4777 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922282 4777 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922295 4777 
flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922309 4777 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922322 4777 flags.go:64] FLAG: --cgroup-root=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922334 4777 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922347 4777 flags.go:64] FLAG: --client-ca-file=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922360 4777 flags.go:64] FLAG: --cloud-config=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922372 4777 flags.go:64] FLAG: --cloud-provider=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922384 4777 flags.go:64] FLAG: --cluster-dns="[]"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922403 4777 flags.go:64] FLAG: --cluster-domain=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922415 4777 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922429 4777 flags.go:64] FLAG: --config-dir=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922441 4777 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922455 4777 flags.go:64] FLAG: --container-log-max-files="5"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922472 4777 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922485 4777 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922498 4777 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922512 4777 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922524 4777 flags.go:64] FLAG: --contention-profiling="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922562 4777 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922576 4777 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922590 4777 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922603 4777 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922619 4777 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922632 4777 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922645 4777 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922658 4777 flags.go:64] FLAG: --enable-load-reader="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922671 4777 flags.go:64] FLAG: --enable-server="true"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922683 4777 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922704 4777 flags.go:64] FLAG: --event-burst="100"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922755 4777 flags.go:64] FLAG: --event-qps="50"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922766 4777 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922777 4777 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922787 4777 flags.go:64] FLAG: --eviction-hard=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922800 4777 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922810 4777 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922819 4777 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922831 4777 flags.go:64] FLAG: --eviction-soft=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922840 4777 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922850 4777 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922859 4777 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922871 4777 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922881 4777 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922891 4777 flags.go:64] FLAG: --fail-swap-on="true"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922901 4777 flags.go:64] FLAG: --feature-gates=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922914 4777 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922924 4777 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922934 4777 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922944 4777 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922954 4777 flags.go:64] FLAG: --healthz-port="10248"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922965 4777 flags.go:64] FLAG: --help="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922974 4777 flags.go:64] FLAG: --hostname-override=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922984 4777 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.922994 4777 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923004 4777 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923027 4777 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923037 4777 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923047 4777 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923057 4777 flags.go:64] FLAG: --image-service-endpoint=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923067 4777 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923076 4777 flags.go:64] FLAG: --kube-api-burst="100"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923086 4777 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923097 4777 flags.go:64] FLAG: --kube-api-qps="50"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923107 4777 flags.go:64] FLAG: --kube-reserved=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923117 4777 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923126 4777 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923136 4777 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923145 4777 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923155 4777 flags.go:64] FLAG: --lock-file=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923165 4777 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923175 4777 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923184 4777 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923209 4777 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923220 4777 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923230 4777 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923240 4777 flags.go:64] FLAG: --logging-format="text"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923250 4777 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923260 4777 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923270 4777 flags.go:64] FLAG: --manifest-url=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923281 4777 flags.go:64] FLAG: --manifest-url-header=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923295 4777 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923307 4777 flags.go:64] FLAG: --max-open-files="1000000"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923323 4777 flags.go:64] FLAG: --max-pods="110"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923336 4777 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923349 4777 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923361 4777 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923374 4777 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923387 4777 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923399 4777 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923411 4777 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923434 4777 flags.go:64] FLAG: --node-status-max-images="50"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923446 4777 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923456 4777 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923466 4777 flags.go:64] FLAG: --pod-cidr=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923476 4777 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923491 4777 flags.go:64] FLAG: --pod-manifest-path=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923501 4777 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923511 4777 flags.go:64] FLAG: --pods-per-core="0"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923521 4777 flags.go:64] FLAG: --port="10250"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923532 4777 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923541 4777 flags.go:64] FLAG: --provider-id=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923550 4777 flags.go:64] FLAG: --qos-reserved=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923560 4777 flags.go:64] FLAG: --read-only-port="10255"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923570 4777 flags.go:64] FLAG: --register-node="true"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923580 4777 flags.go:64] FLAG: --register-schedulable="true"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923590 4777 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923606 4777 flags.go:64] FLAG: --registry-burst="10"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923616 4777 flags.go:64] FLAG: --registry-qps="5"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923626 4777 flags.go:64] FLAG: --reserved-cpus=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923636 4777 flags.go:64] FLAG: --reserved-memory=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923648 4777 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923657 4777 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923667 4777 flags.go:64] FLAG: --rotate-certificates="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923677 4777 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923687 4777 flags.go:64] FLAG: --runonce="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923697 4777 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923707 4777 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923748 4777 flags.go:64] FLAG: --seccomp-default="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923758 4777 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923768 4777 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923779 4777 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923788 4777 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923799 4777 flags.go:64] FLAG: --storage-driver-password="root"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923809 4777 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923854 4777 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923865 4777 flags.go:64] FLAG: --storage-driver-user="root"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923875 4777 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923888 4777 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923898 4777 flags.go:64] FLAG: --system-cgroups=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923907 4777 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923923 4777 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923932 4777 flags.go:64] FLAG: --tls-cert-file=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923943 4777 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923955 4777 flags.go:64] FLAG: --tls-min-version=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923965 4777 flags.go:64] FLAG: --tls-private-key-file=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923974 4777 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923984 4777 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.923994 4777 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.924004 4777 flags.go:64] FLAG: --v="2"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.924017 4777 flags.go:64] FLAG: --version="false"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.924030 4777 flags.go:64] FLAG: --vmodule=""
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.924042 4777 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.924052 4777 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924270 4777 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924281 4777 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924291 4777 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924301 4777 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924311 4777 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924322 4777 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924332 4777 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924346 4777 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924360 4777 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924373 4777 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924384 4777 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924395 4777 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924404 4777 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924413 4777 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924422 4777 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924430 4777 feature_gate.go:330] unrecognized feature gate: Example
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924438 4777 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924447 4777 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924455 4777 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924465 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924475 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924492 4777 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924502 4777 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924511 4777 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924520 4777 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924529 4777 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924537 4777 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924546 4777 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924554 4777 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924563 4777 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924572 4777 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924580 4777 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924589 4777 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924598 4777 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924607 4777 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924615 4777 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924624 4777 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924632 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924641 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924649 4777 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924658 4777 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924667 4777 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924675 4777 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924699 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924707 4777 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924752 4777 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924761 4777 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924770 4777 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924778 4777 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924787 4777 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924796 4777 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924804 4777 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924812 4777 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924858 4777 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924870 4777 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924880 4777 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924893 4777 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924904 4777 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924913 4777 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924922 4777 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924931 4777 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924941 4777 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924951 4777 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924960 4777 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924968 4777 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924977 4777 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924986 4777 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.924994 4777 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.925002 4777 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.925011 4777 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.925019 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.926900 4777 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.940225 4777 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.940287 4777 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940433 4777 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940448 4777 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940456 4777 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940466 4777 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940474 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940483 4777 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940492 4777 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940501 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940512 4777 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940522 4777 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940532 4777 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940540 4777 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940548 4777 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940556 4777 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940563 4777 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940571 4777 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940580 4777 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940588 4777 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940596 4777 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940603 4777 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940611 4777 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940619 4777 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940627 4777 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940635 4777 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940642 4777 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940650 4777 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940658 4777 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940666 4777 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940676 4777 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940684 4777 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940692 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940700 4777 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940708 4777 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940749 4777 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940758 4777 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940767 4777 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940777 4777 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940786 4777 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940794 4777 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940802 4777 feature_gate.go:330] unrecognized feature gate: Example
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940810 4777 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940818 4777 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940827 4777 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940834 4777 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940842 4777 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940851 4777 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940858 4777 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940868 4777 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940877 4777 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940885 4777 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940894 4777 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940902 4777 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940910 4777 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940919 4777 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940927 4777 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940934 4777 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940942 4777 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940954 4777 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940967 4777 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940976 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940985 4777 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.940994 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941002 4777 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941011 4777 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941021 4777 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941032 4777 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941041 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941052 4777 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941062 4777 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941071 4777 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941080 4777 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.941094 4777 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941330 4777 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941345 4777 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941353 4777 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941361 4777 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941369 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941377 4777 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941385 4777 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941393 4777 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941400 4777 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941408 4777 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941416 4777 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941424 4777 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941432 4777 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941440 4777 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941449 4777 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941460 4777 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941470 4777 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941480 4777 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941488 4777 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941499 4777 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941508 4777 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941518 4777 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941526 4777 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941534 4777 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941542 4777 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941550 4777 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941558 4777 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941566 4777 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941573 4777 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941581 4777 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941589 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941598 4777 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941607 4777 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941614 4777 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941622 4777 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941629 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941638 4777 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941646 4777 feature_gate.go:330] unrecognized feature gate: Example
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941655 4777 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941663 4777 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941671 4777 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941679 4777 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941687 4777 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941697 4777 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941707 4777 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941741 4777 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941750 4777 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941758 4777 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941767 4777 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941776 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941785 4777 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941792 4777 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941800 4777 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941811 4777 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941820 4777 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941828 4777 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941836 4777 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941844 4777 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941852 4777 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941860 4777 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941868 4777 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941876 4777 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941884 4777 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941892 4777 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941900 4777 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941908 4777 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941916 4777 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941924 4777 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941932 4777 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941940 4777 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 16 21:37:59 crc kubenswrapper[4777]: W0216 21:37:59.941948 4777 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.941961 4777 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.943295 4777 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.949959 4777 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.950150 4777 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.952186 4777 server.go:997] "Starting client certificate rotation"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.952248 4777 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.953628 4777 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-17 04:45:00.588225389 +0000 UTC
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.953833 4777 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.977294 4777 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.980788 4777 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 16 21:37:59 crc kubenswrapper[4777]: E0216 21:37:59.983810 4777 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError"
Feb 16 21:37:59 crc kubenswrapper[4777]: I0216 21:37:59.999869 4777 log.go:25] "Validated CRI v1 runtime API"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.039758 4777 log.go:25] "Validated CRI v1 image API"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.042473 4777 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.048694 4777 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-16-21-33-11-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.048775 4777 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.078600 4777 manager.go:217] Machine: {Timestamp:2026-02-16 21:38:00.074870821 +0000 UTC m=+0.657372003 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4b23e571-272d-4c87-821c-0e1a2dceb613 BootID:dd88e8c4-6b8b-421e-820d-c535c131d8af Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:89:4b:1e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:89:4b:1e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:16:9a:e7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:88:ab:1b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c1:5e:83 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7a:6e:15 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:c6:e5:40:85:68:c6 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ee:f0:07:1e:9a:96 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.079054 4777 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.079385 4777 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.080811 4777 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.081151 4777 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.081202 4777 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.081554 4777 topology_manager.go:138] "Creating topology manager with none policy"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.081573 4777 container_manager_linux.go:303] "Creating device plugin manager"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.082181 4777 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.082232 4777 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.082468 4777 state_mem.go:36] "Initialized new in-memory state store"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.082615 4777 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.086063 4777 kubelet.go:418] "Attempting to sync node with API server"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.086097 4777 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.086139 4777 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.086162 4777 kubelet.go:324] "Adding apiserver pod source"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.086182 4777 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.090252 4777 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.091606 4777 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 16 21:38:00 crc kubenswrapper[4777]: W0216 21:38:00.091567 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.091689 4777 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError" Feb 16 21:38:00 crc kubenswrapper[4777]: W0216 21:38:00.092081 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.092192 4777 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.093749 4777 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095525 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095575 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 
21:38:00.095594 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095616 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095645 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095679 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095697 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095774 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095796 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095813 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095859 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095880 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.095923 4777 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.096699 4777 server.go:1280] "Started kubelet" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.097857 4777 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.097875 4777 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.099038 4777 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.099555 4777 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 16 21:38:00 crc systemd[1]: Started Kubernetes Kubelet. Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.101672 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.101796 4777 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.101868 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 02:46:41.687946964 +0000 UTC Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.102130 4777 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.102164 4777 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.102325 4777 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.102657 4777 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 16 21:38:00 crc kubenswrapper[4777]: W0216 21:38:00.103564 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.103685 4777 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.104071 4777 factory.go:55] Registering systemd factory Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.109317 4777 factory.go:221] Registration of the systemd container factory successfully Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.103986 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" interval="200ms" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.110545 4777 factory.go:153] Registering CRI-O factory Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.110601 4777 factory.go:221] Registration of the crio container factory successfully Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.110758 4777 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.110803 4777 factory.go:103] Registering Raw factory Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.110838 4777 manager.go:1196] Started watching for new ooms in manager Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.111625 4777 server.go:460] "Adding debug handlers to kubelet server" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.112320 4777 manager.go:319] Starting recovery of all containers Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.112507 
4777 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.203:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1894d7d8e9aac77e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-16 21:38:00.096638846 +0000 UTC m=+0.679139988,LastTimestamp:2026-02-16 21:38:00.096638846 +0000 UTC m=+0.679139988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127245 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127339 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127372 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127392 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127414 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127433 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127454 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127473 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127545 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127591 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5"
volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127619 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127640 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127663 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127707 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127773 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127793 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5"
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127817 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127836 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127854 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127873 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127893 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127912 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5"
volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127945 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127966 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.127985 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128006 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128027 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128047 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def"
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128069 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128088 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128114 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128132 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128218 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128236 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a"
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128254 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128274 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128328 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128347 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128365 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128384 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8"
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128403 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128423 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128442 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128461 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128481 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128499 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb"
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128521 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128541 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128563 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128586 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128607 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128628 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f"
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128658 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128683 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128703 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128799 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128821 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128842 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128860 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128878 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128898 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128915 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128932 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128951 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783"
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128974 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.128997 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129025 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129049 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129077 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129101 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129128 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129151 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129223 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129250 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129276 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129303 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4"
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129330 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129354 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129382 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129403 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129425 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129445 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v"
seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129464 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129482 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129504 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129522 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129541 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129559 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129578 4777
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129596 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129613 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129632 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129652 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129675 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129692 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129711 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129809 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129829 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129847 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129866 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129885 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011"
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129906 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129925 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129944 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129970 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.129990 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130051 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" 
seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130073 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130095 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130116 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130139 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130160 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130182 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130204 
4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130223 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130241 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130261 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130279 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130298 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130318 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130339 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130358 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130384 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130410 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130436 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130464 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130488 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130506 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130529 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130551 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130577 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130601 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130625 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130646 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130668 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130690 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130710 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130762 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130782 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130801 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130823 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130843 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130869 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130889 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130914 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130932 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130949 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130969 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.130990 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131037 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131056 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131076 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131094 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131112 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131130 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131148 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131167 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131188 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131210 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131231 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131253 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131274 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131293 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131312 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131332 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131351 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131371 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131391 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131411 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131429 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131447 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131466 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131484 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131502 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" 
seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131522 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131541 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131562 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131583 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131601 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.131622 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133656 
4777 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133697 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133752 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133777 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133796 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133816 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 
21:38:00.133836 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133855 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133876 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133895 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133914 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133936 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133957 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133976 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.133993 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134011 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134028 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134049 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134067 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134085 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134103 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134121 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134138 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134155 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134174 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134195 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134215 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134232 4777 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134252 4777 reconstruct.go:97] "Volume reconstruction finished" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.134265 4777 reconciler.go:26] "Reconciler: start to sync state" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.144561 4777 manager.go:324] Recovery completed Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.154641 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.159216 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.159302 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.160619 4777 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.162623 4777 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.162664 4777 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.162705 4777 state_mem.go:36] "Initialized new in-memory state store" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.174686 4777 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.179158 4777 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.179354 4777 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.179527 4777 kubelet.go:2335] "Starting kubelet main sync loop" Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.180594 4777 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 16 21:38:00 crc kubenswrapper[4777]: W0216 21:38:00.180444 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.180959 4777 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError" Feb 16 21:38:00 crc 
kubenswrapper[4777]: I0216 21:38:00.182398 4777 policy_none.go:49] "None policy: Start" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.183289 4777 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.183354 4777 state_mem.go:35] "Initializing new in-memory state store" Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.203793 4777 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.268766 4777 manager.go:334] "Starting Device Plugin manager" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.269062 4777 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.269089 4777 server.go:79] "Starting device plugin registration server" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.269613 4777 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.269660 4777 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.269927 4777 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.270261 4777 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.270410 4777 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.281279 4777 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.281398 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.281418 4777 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.282816 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.282865 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.282882 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.283061 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.283497 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.283608 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.284177 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.284211 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.284226 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.284341 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.284564 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.284662 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.285186 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.285218 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.285234 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.285268 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.285339 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.285357 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.285375 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.285546 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.285641 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.286266 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.286341 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.286371 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.286378 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.286426 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.286451 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.286832 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.287014 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.287094 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.287324 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.287374 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.287393 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.287990 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.288036 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.288055 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.288302 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.288355 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.288366 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.288395 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.288411 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.291186 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.291227 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.291244 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.310763 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" interval="400ms" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.337808 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.337850 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.337879 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.337905 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.337964 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.338020 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:00 crc 
kubenswrapper[4777]: I0216 21:38:00.338064 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.338117 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.338161 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.338193 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.338225 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.338256 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.338282 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.338323 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.338421 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.370259 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.371788 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.371865 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:00 crc kubenswrapper[4777]: 
I0216 21:38:00.371886 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.371933 4777 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.372717 4777 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.203:6443: connect: connection refused" node="crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439356 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439427 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439462 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439493 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439525 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439565 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439580 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439600 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439622 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 
21:38:00.439683 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439758 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439697 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439803 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439698 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439818 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439824 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439877 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439897 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439885 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.439968 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 16 21:38:00 crc 
kubenswrapper[4777]: I0216 21:38:00.440021 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.440079 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.440107 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.440128 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.440179 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.440227 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.440248 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.440301 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.440374 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.440420 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.573469 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.575578 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.575640 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.575662 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.575748 4777 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.576548 4777 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.203:6443: connect: connection refused" node="crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.633609 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.642196 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.660964 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.680537 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: W0216 21:38:00.682552 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-0ecff9ee2dfdbcf2610d2dddbcd8e95d8294e36e2b2ddb35f72075f1eed1f585 WatchSource:0}: Error finding container 0ecff9ee2dfdbcf2610d2dddbcd8e95d8294e36e2b2ddb35f72075f1eed1f585: Status 404 returned error can't find the container with id 0ecff9ee2dfdbcf2610d2dddbcd8e95d8294e36e2b2ddb35f72075f1eed1f585
Feb 16 21:38:00 crc kubenswrapper[4777]: W0216 21:38:00.685705 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5ba847ad02a86f7b62d13a9dfee717b0cfb88dc9da435903d7af880db115d013 WatchSource:0}: Error finding container 5ba847ad02a86f7b62d13a9dfee717b0cfb88dc9da435903d7af880db115d013: Status 404 returned error can't find the container with id 5ba847ad02a86f7b62d13a9dfee717b0cfb88dc9da435903d7af880db115d013
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.688139 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: W0216 21:38:00.694148 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-fa5ec0acfd33ae2b0d4c525f5b34b85efbd2ce2d8d4336f4578ec0e9f3937d97 WatchSource:0}: Error finding container fa5ec0acfd33ae2b0d4c525f5b34b85efbd2ce2d8d4336f4578ec0e9f3937d97: Status 404 returned error can't find the container with id fa5ec0acfd33ae2b0d4c525f5b34b85efbd2ce2d8d4336f4578ec0e9f3937d97
Feb 16 21:38:00 crc kubenswrapper[4777]: W0216 21:38:00.706917 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-ab0b78c038a6b55d1241fe848e3e6c9fce491445d4ea9c84824e64f0b37f038d WatchSource:0}: Error finding container ab0b78c038a6b55d1241fe848e3e6c9fce491445d4ea9c84824e64f0b37f038d: Status 404 returned error can't find the container with id ab0b78c038a6b55d1241fe848e3e6c9fce491445d4ea9c84824e64f0b37f038d
Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.712508 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" interval="800ms"
Feb 16 21:38:00 crc kubenswrapper[4777]: W0216 21:38:00.715815 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-65a6ef016dce0959ce258cabd88487c6f4c9c30890a5037532e7ab5f9214652b WatchSource:0}: Error finding container 65a6ef016dce0959ce258cabd88487c6f4c9c30890a5037532e7ab5f9214652b: Status 404 returned error can't find the container with id 65a6ef016dce0959ce258cabd88487c6f4c9c30890a5037532e7ab5f9214652b
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.977533 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.980225 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.980270 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.980282 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:00 crc kubenswrapper[4777]: I0216 21:38:00.980311 4777 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 16 21:38:00 crc kubenswrapper[4777]: E0216 21:38:00.980997 4777 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.203:6443: connect: connection refused" node="crc"
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.100073 4777 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.103207 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 11:58:19.238775472 +0000 UTC
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.185264 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0ecff9ee2dfdbcf2610d2dddbcd8e95d8294e36e2b2ddb35f72075f1eed1f585"}
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.187428 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"65a6ef016dce0959ce258cabd88487c6f4c9c30890a5037532e7ab5f9214652b"}
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.188578 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ab0b78c038a6b55d1241fe848e3e6c9fce491445d4ea9c84824e64f0b37f038d"}
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.189890 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fa5ec0acfd33ae2b0d4c525f5b34b85efbd2ce2d8d4336f4578ec0e9f3937d97"}
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.190901 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5ba847ad02a86f7b62d13a9dfee717b0cfb88dc9da435903d7af880db115d013"}
Feb 16 21:38:01 crc kubenswrapper[4777]: W0216 21:38:01.237991 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused
Feb 16 21:38:01 crc kubenswrapper[4777]: E0216 21:38:01.238080 4777 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError"
Feb 16 21:38:01 crc kubenswrapper[4777]: W0216 21:38:01.249098 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused
Feb 16 21:38:01 crc kubenswrapper[4777]: E0216 21:38:01.249182 4777 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError"
Feb 16 21:38:01 crc kubenswrapper[4777]: W0216 21:38:01.460769 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused
Feb 16 21:38:01 crc kubenswrapper[4777]: E0216 21:38:01.460916 4777 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError"
Feb 16 21:38:01 crc kubenswrapper[4777]: W0216 21:38:01.474687 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused
Feb 16 21:38:01 crc kubenswrapper[4777]: E0216 21:38:01.474790 4777 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError"
Feb 16 21:38:01 crc kubenswrapper[4777]: E0216 21:38:01.513279 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" interval="1.6s"
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.781779 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.783219 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.783260 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.783272 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:01 crc kubenswrapper[4777]: I0216 21:38:01.783298 4777 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 16 21:38:01 crc kubenswrapper[4777]: E0216 21:38:01.783921 4777 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.203:6443: connect: connection refused" node="crc"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.101489 4777 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.103630 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 05:52:51.790245363 +0000 UTC
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.106910 4777 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 16 21:38:02 crc kubenswrapper[4777]: E0216 21:38:02.108647 4777 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.198517 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e1e8a2fbc30814f759055809dfd0c5a218aea393296d6748a612affc728fc50c"}
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.198396 4777 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e1e8a2fbc30814f759055809dfd0c5a218aea393296d6748a612affc728fc50c" exitCode=0
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.201712 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.205352 4777 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd" exitCode=0
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.205453 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd"}
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.205581 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.206915 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.206970 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.206989 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.207553 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.207754 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.207926 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.210271 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074"}
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.210297 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938"}
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.210312 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce"}
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.213072 4777 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae" exitCode=0
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.213341 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.213368 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae"}
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.214962 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.214993 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.215009 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.216024 4777 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="426564065ad1b45e7ee00c8f8efb4a1e75c36f6ef5e93eea38310bbc8a694052" exitCode=0
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.216080 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"426564065ad1b45e7ee00c8f8efb4a1e75c36f6ef5e93eea38310bbc8a694052"}
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.216116 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.217947 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.217982 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.217994 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.220665 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.222134 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.222300 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:02 crc kubenswrapper[4777]: I0216 21:38:02.222421 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:02 crc kubenswrapper[4777]: W0216 21:38:02.893309 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused
Feb 16 21:38:02 crc kubenswrapper[4777]: E0216 21:38:02.893395 4777 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.100664 4777 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.104071 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 03:16:16.177581249 +0000 UTC
Feb 16 21:38:03 crc kubenswrapper[4777]: E0216 21:38:03.114306 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" interval="3.2s"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.230827 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"74eae1edeef1eec3ff139d9c4fefbdc0185b16eab1f433889ad340496d348a99"}
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.230913 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.232207 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.232260 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.232278 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.246170 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"84083a9adcea879674a04ba75af8d30d25553064123c03b6a1425b90f08632ba"}
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.246275 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"090314b845c376f3298674de547ea13a6de12b9dfb56457af52c142701c93a8e"}
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.246294 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"70a5f7e088b21d8907a92a83e39480a53b84fb0c177281e0ee91eac8409f5858"}
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.246319 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.248667 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.248741 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.248755 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.260147 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf"}
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.260191 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.261634 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.261664 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.261674 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.263973 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996"}
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.264003 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d"}
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.264018 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a"}
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.264030 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51"}
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.265650 4777 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="15327315956a43055a0483d8c8b3ee6918334b8a1af55541cd5a206538c18083" exitCode=0
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.265685 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"15327315956a43055a0483d8c8b3ee6918334b8a1af55541cd5a206538c18083"}
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.265810 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.266591 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.266632 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.266649 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.384288 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.385793 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.385832 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.385845 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:03 crc kubenswrapper[4777]: I0216 21:38:03.385873 4777 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 16 21:38:03 crc kubenswrapper[4777]: E0216 21:38:03.386409 4777 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.203:6443: connect: connection refused" node="crc"
Feb 16 21:38:03 crc kubenswrapper[4777]: W0216 21:38:03.701127 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.203:6443: connect: connection refused
Feb 16 21:38:03 crc kubenswrapper[4777]: E0216 21:38:03.701288 4777 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.203:6443: connect: connection refused" logger="UnhandledError"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.104382 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 13:22:24.164506482 +0000 UTC
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.271614 4777 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7ba8c697391c6ca0eaf4a50860ecf0c3f5f10d8c55f17e24bf72dbd2e1c55c28" exitCode=0
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.271776 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7ba8c697391c6ca0eaf4a50860ecf0c3f5f10d8c55f17e24bf72dbd2e1c55c28"}
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.271840 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.273947 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.274002 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.274041 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.276973 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.276975 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b"}
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.277067 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.276974 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.277012 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.276974 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.278628 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.278653 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.278694 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.278703 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.278663 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.278765 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.278783 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.278712 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.278961 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.278999 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:04 crc kubenswrapper[4777]:
I0216 21:38:04.279022 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.278767 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.410203 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:04 crc kubenswrapper[4777]: I0216 21:38:04.554044 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.104605 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 08:54:05.18772312 +0000 UTC Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.284406 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.284689 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"170aa70915c75a1fb2b3577b20f67cd63665b6f111d74aef8be7d7f1ec1e32b7"} Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.284774 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a05f57e1805d6be1c3aaa5fa6fcda9b298f5f96f345e2a4a93e9aae84e57d4ae"} Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.284831 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"72bc7f6014301bf462c2b73412471e51dbf7257eb60aad20d744cbbf63e43bff"} Feb 16 
21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.284931 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.284954 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.285846 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.285909 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.285931 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.285911 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.285959 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.285978 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.285990 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.285980 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.286032 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.550757 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:05 crc kubenswrapper[4777]: I0216 21:38:05.644031 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.105798 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 11:03:13.251638064 +0000 UTC Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.296021 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9a45c67f5618733f9019e49fdd14c469083827908a73536ac8f0c82f06108836"} Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.296107 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"703ee4fe04715db00ea890a330a95bbaa714e9c4dc145b30799824c233891d5c"} Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.296115 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.296113 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.297593 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.297642 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.297660 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.297861 
4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.297915 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.297938 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.377409 4777 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.586524 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.588670 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.588778 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.588798 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.588834 4777 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.785676 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.785889 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.787626 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:06 crc 
kubenswrapper[4777]: I0216 21:38:06.787692 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.787759 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:06 crc kubenswrapper[4777]: I0216 21:38:06.795291 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.106428 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 03:08:35.082174631 +0000 UTC Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.299537 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.299615 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.299555 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.301811 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.301869 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.301892 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.302297 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:07 crc kubenswrapper[4777]: 
I0216 21:38:07.302353 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.302377 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.302420 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.302469 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.302486 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.515060 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.554507 4777 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 16 21:38:07 crc kubenswrapper[4777]: I0216 21:38:07.554614 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 16 21:38:08 crc kubenswrapper[4777]: I0216 21:38:08.107519 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-11-25 16:04:49.731224465 +0000 UTC Feb 16 21:38:08 crc kubenswrapper[4777]: I0216 21:38:08.303186 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:08 crc kubenswrapper[4777]: I0216 21:38:08.304784 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:08 crc kubenswrapper[4777]: I0216 21:38:08.304855 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:08 crc kubenswrapper[4777]: I0216 21:38:08.304872 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:08 crc kubenswrapper[4777]: I0216 21:38:08.864790 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:08 crc kubenswrapper[4777]: I0216 21:38:08.865284 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:08 crc kubenswrapper[4777]: I0216 21:38:08.867159 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:08 crc kubenswrapper[4777]: I0216 21:38:08.867355 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:08 crc kubenswrapper[4777]: I0216 21:38:08.867493 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:08 crc kubenswrapper[4777]: I0216 21:38:08.889373 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 16 21:38:09 crc kubenswrapper[4777]: I0216 21:38:09.107773 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 
16:32:19.318272508 +0000 UTC Feb 16 21:38:09 crc kubenswrapper[4777]: I0216 21:38:09.306003 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:09 crc kubenswrapper[4777]: I0216 21:38:09.307466 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:09 crc kubenswrapper[4777]: I0216 21:38:09.307666 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:09 crc kubenswrapper[4777]: I0216 21:38:09.307856 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:09 crc kubenswrapper[4777]: I0216 21:38:09.767071 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:09 crc kubenswrapper[4777]: I0216 21:38:09.767352 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:09 crc kubenswrapper[4777]: I0216 21:38:09.769320 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:09 crc kubenswrapper[4777]: I0216 21:38:09.769384 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:09 crc kubenswrapper[4777]: I0216 21:38:09.769402 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:10 crc kubenswrapper[4777]: I0216 21:38:10.108032 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 05:39:00.219743778 +0000 UTC Feb 16 21:38:10 crc kubenswrapper[4777]: E0216 21:38:10.281576 4777 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get 
node info: node \"crc\" not found" Feb 16 21:38:11 crc kubenswrapper[4777]: I0216 21:38:11.108766 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 14:58:34.108550125 +0000 UTC Feb 16 21:38:12 crc kubenswrapper[4777]: I0216 21:38:12.109573 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 07:53:48.889994394 +0000 UTC Feb 16 21:38:13 crc kubenswrapper[4777]: I0216 21:38:13.110356 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 14:20:09.055090483 +0000 UTC Feb 16 21:38:13 crc kubenswrapper[4777]: W0216 21:38:13.861533 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 16 21:38:13 crc kubenswrapper[4777]: I0216 21:38:13.861665 4777 trace.go:236] Trace[869646176]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Feb-2026 21:38:03.859) (total time: 10002ms): Feb 16 21:38:13 crc kubenswrapper[4777]: Trace[869646176]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (21:38:13.861) Feb 16 21:38:13 crc kubenswrapper[4777]: Trace[869646176]: [10.002160089s] [10.002160089s] END Feb 16 21:38:13 crc kubenswrapper[4777]: E0216 21:38:13.861698 4777 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 
16 21:38:13 crc kubenswrapper[4777]: W0216 21:38:13.921142 4777 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 16 21:38:13 crc kubenswrapper[4777]: I0216 21:38:13.921259 4777 trace.go:236] Trace[1723360524]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Feb-2026 21:38:03.919) (total time: 10001ms): Feb 16 21:38:13 crc kubenswrapper[4777]: Trace[1723360524]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (21:38:13.921) Feb 16 21:38:13 crc kubenswrapper[4777]: Trace[1723360524]: [10.001452509s] [10.001452509s] END Feb 16 21:38:13 crc kubenswrapper[4777]: E0216 21:38:13.921287 4777 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 16 21:38:14 crc kubenswrapper[4777]: I0216 21:38:14.101389 4777 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 16 21:38:14 crc kubenswrapper[4777]: I0216 21:38:14.110574 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 14:25:37.057804744 +0000 UTC Feb 16 21:38:14 crc kubenswrapper[4777]: I0216 21:38:14.649347 4777 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed 
with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 16 21:38:14 crc kubenswrapper[4777]: I0216 21:38:14.649453 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 16 21:38:14 crc kubenswrapper[4777]: I0216 21:38:14.657805 4777 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 16 21:38:14 crc kubenswrapper[4777]: I0216 21:38:14.657961 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 16 21:38:15 crc kubenswrapper[4777]: I0216 21:38:15.110955 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 09:39:12.961520525 +0000 UTC Feb 16 21:38:15 crc kubenswrapper[4777]: I0216 21:38:15.560789 4777 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]log ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]etcd ok Feb 16 21:38:15 crc kubenswrapper[4777]: 
[+]poststarthook/openshift.io-startkubeinformers ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/generic-apiserver-start-informers ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/priority-and-fairness-filter ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/start-apiextensions-informers ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/start-apiextensions-controllers ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/crd-informer-synced ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/start-system-namespaces-controller ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 16 21:38:15 crc kubenswrapper[4777]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 16 21:38:15 crc kubenswrapper[4777]: 
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/bootstrap-controller ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/start-kube-aggregator-informers ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/apiservice-registration-controller ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/apiservice-discovery-controller ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]autoregister-completion ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/apiservice-openapi-controller ok Feb 16 21:38:15 crc kubenswrapper[4777]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 16 21:38:15 crc kubenswrapper[4777]: livez check failed Feb 16 21:38:15 crc kubenswrapper[4777]: I0216 21:38:15.560917 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:38:16 crc kubenswrapper[4777]: I0216 21:38:16.111459 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 00:23:29.787900262 +0000 UTC Feb 16 21:38:17 crc kubenswrapper[4777]: I0216 21:38:17.112158 4777 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 16:56:05.338062083 +0000 UTC Feb 16 21:38:17 crc kubenswrapper[4777]: I0216 21:38:17.555326 4777 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 16 21:38:17 crc kubenswrapper[4777]: I0216 21:38:17.555453 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.112265 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 14:01:20.186182109 +0000 UTC Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.872319 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.872579 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.874436 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.874505 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.874526 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.925279 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.925867 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.927459 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.927503 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.927516 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:18 crc kubenswrapper[4777]: I0216 21:38:18.943271 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.046590 4777 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.113201 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 20:08:59.565314309 +0000 UTC Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.339300 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.340496 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:19 crc 
kubenswrapper[4777]: I0216 21:38:19.340551 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.340562 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:19 crc kubenswrapper[4777]: E0216 21:38:19.645481 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.651572 4777 trace.go:236] Trace[1755118039]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Feb-2026 21:38:09.626) (total time: 10025ms): Feb 16 21:38:19 crc kubenswrapper[4777]: Trace[1755118039]: ---"Objects listed" error: 10024ms (21:38:19.651) Feb 16 21:38:19 crc kubenswrapper[4777]: Trace[1755118039]: [10.025086668s] [10.025086668s] END Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.651598 4777 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.653312 4777 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.656149 4777 trace.go:236] Trace[1182400120]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Feb-2026 21:38:07.953) (total time: 11700ms): Feb 16 21:38:19 crc kubenswrapper[4777]: Trace[1182400120]: ---"Objects listed" error: 11700ms (21:38:19.654) Feb 16 21:38:19 crc kubenswrapper[4777]: Trace[1182400120]: [11.700930506s] [11.700930506s] END Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.656193 4777 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 16 21:38:19 crc kubenswrapper[4777]: E0216 
21:38:19.656614 4777 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.661250 4777 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.927483 4777 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:36172->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 16 21:38:19 crc kubenswrapper[4777]: I0216 21:38:19.927553 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:36172->192.168.126.11:17697: read: connection reset by peer" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.095963 4777 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.100923 4777 apiserver.go:52] "Watching apiserver" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.104236 4777 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.104439 4777 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.104806 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.104891 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.104979 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.105175 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.105774 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.105790 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.105875 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.105768 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.105903 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.107591 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.107672 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.107907 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.108039 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.108918 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 16 21:38:20 crc 
kubenswrapper[4777]: I0216 21:38:20.109160 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.111340 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.111377 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.111422 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.113293 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 23:50:58.475307934 +0000 UTC Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.145280 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.158769 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.172496 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.185409 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.203344 4777 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.205214 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.222529 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.242655 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256307 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256364 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256389 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256413 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256434 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256459 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256482 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256504 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256528 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256552 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256574 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256597 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256617 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256645 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256667 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256688 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256724 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256749 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256769 4777 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256767 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256811 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256767 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256832 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256898 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256929 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256969 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256989 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257010 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257045 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257069 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257092 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257113 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257159 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257182 4777 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257203 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257227 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257249 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257271 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257297 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257318 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257340 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257363 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257385 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257407 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " 
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257431 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257452 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.256812 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257011 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257526 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). 
InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257079 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257192 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257221 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257301 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257412 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257446 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257590 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257686 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257795 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257808 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257824 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257855 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257898 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258012 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258066 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258073 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.257477 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258177 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258191 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258440 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258204 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258252 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258263 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258285 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258425 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258612 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258612 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258678 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258206 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258747 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258756 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258777 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258803 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258828 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258832 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258851 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258873 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258894 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258916 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258939 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258961 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258984 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259008 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259029 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259051 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259073 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259098 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259120 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259142 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259167 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259190 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259213 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259238 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259262 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259287 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259311 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259337 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259361 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259384 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259406 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259431 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259454 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259478 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.260924 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.260980 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261017 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261045 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261079 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261109 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261135 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261167 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261198 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261230 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261261 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261295 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261329 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262608 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262655 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262690 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262736 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262771 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262803 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262833 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262864 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262899 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262999 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263034 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263068 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263107 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263250 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263296 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263336 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263374 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263409 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263445 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263472 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263504 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263538 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263567 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263600 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263637 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263671 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263706 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263759 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263791 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263820 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263853 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263886 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263931 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.263968 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264002 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264035 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264062 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264093 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264122 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264148 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264182 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264217 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264242 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264276 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264309 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264337 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264366 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264414 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264450 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264476 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264509 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264541 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264572 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264603 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264638 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264671 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264698 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264748 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264790 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264828 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264871 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264908 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264943 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.264971 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265001 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265032 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265059 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265090 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265121 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265152 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265179 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265240 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265293 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265626 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265754 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265797 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265856 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265919 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265972 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266016 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266053 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266085 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266123 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266167 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266196 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266237 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 16 
21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266281 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266315 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266356 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266407 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266448 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266479 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") 
pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266514 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266553 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266587 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266624 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266664 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266690 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266749 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266788 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266823 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266855 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266892 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266998 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267066 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267116 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267159 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267196 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267241 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267278 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267326 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267367 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267402 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267444 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267484 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267525 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267557 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:20 crc 
kubenswrapper[4777]: I0216 21:38:20.267957 4777 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267995 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268011 4777 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268034 4777 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268050 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268063 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268074 4777 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268090 4777 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268101 4777 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268111 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268126 4777 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268147 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268167 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268180 4777 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268201 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268214 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268224 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268235 4777 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268249 4777 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268259 4777 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268269 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268280 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" 
DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268290 4777 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268301 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268311 4777 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268323 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268333 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268343 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268354 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268366 4777 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268376 4777 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268387 4777 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268400 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258848 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.275052 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258851 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258865 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.258981 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259035 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259078 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259252 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259385 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259457 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261030 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261192 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261209 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261204 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261412 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261432 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261641 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261646 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261788 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261900 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261991 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.261993 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262001 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262088 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262134 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262372 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.262180 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265447 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.265582 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266072 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.259464 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266213 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266232 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266903 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266897 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.266914 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267044 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267375 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267557 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267677 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267705 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.267984 4777 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.267997 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268194 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268381 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268410 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268525 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268611 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.268821 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269275 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269544 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269539 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269647 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269666 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269661 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269818 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269898 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269938 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269933 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269244 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.270351 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.270317 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.269619 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.270146 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.270133 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.270754 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.271275 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.271486 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.271501 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.271519 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.271925 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.271505 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.272123 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.273102 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.273328 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.270609 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274107 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274242 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.273060 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.273064 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274392 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274382 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274428 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274459 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274473 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274523 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274542 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274584 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274689 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274804 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274814 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.274835 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.275121 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.275208 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.275249 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.275869 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.275909 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.275934 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.275965 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.276025 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.276073 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:20.776026009 +0000 UTC m=+21.358527191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.276163 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.276225 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.276205 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.275659 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.276245 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.276356 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.276389 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.276532 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.276626 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.277064 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:38:20.777032347 +0000 UTC m=+21.359533619 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.277622 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.278382 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.278806 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.278823 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.278846 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.279081 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.279199 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.279502 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.279664 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.279852 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.279918 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.280182 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.280470 4777 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.280835 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.280988 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.281298 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.281372 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.281432 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.282044 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.282075 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.282172 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.282325 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.282906 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.283050 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.283147 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.283169 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.283984 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.285562 4777 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.285791 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:20.785709395 +0000 UTC m=+21.368210697 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.286686 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.294210 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.294238 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.294622 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.296024 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.296965 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.296994 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.297016 4777 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.297028 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.297102 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:20.797073866 +0000 UTC m=+21.379575178 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.297858 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.298284 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.298397 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.298964 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.299315 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.300000 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.300776 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" 
(UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.300707 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.301238 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.301493 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.301749 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.301809 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.304196 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.304558 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.304587 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.304579 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.304606 4777 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.304673 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:20.804646954 +0000 UTC m=+21.387148276 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.305346 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.306082 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.306439 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.306621 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.307073 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.307421 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.309496 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.309976 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.311250 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.311603 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.311617 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.311869 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.311980 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.312252 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.320739 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.324279 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.331056 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.333272 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.333691 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.339075 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.343494 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.344432 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.345918 4777 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b" exitCode=255 Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.345979 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b"} Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.355006 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.360545 4777 scope.go:117] "RemoveContainer" containerID="43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.360687 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.365916 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369613 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369675 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369768 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369785 4777 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369797 4777 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369813 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369830 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369843 4777 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369854 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369867 4777 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" 
DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369878 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369892 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369911 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369928 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369946 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369961 4777 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369972 4777 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369983 4777 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.369995 4777 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370006 4777 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370018 4777 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370030 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370042 4777 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370054 4777 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370069 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370081 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370093 4777 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370104 4777 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370117 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370128 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370140 4777 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370151 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370164 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370175 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370189 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370202 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370215 4777 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370228 4777 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370240 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" 
DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370252 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370268 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370280 4777 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370292 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370304 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370316 4777 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370328 4777 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370340 4777 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370352 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370364 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370378 4777 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370390 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370401 4777 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370413 4777 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370427 4777 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370439 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370451 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370466 4777 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370478 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370492 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370505 4777 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370517 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370529 4777 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370540 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370551 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370563 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370574 4777 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370586 4777 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370596 4777 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 
21:38:20.370609 4777 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370622 4777 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370633 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370645 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370664 4777 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370678 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370692 4777 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370704 4777 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370736 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370748 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370760 4777 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370782 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370794 4777 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370805 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370817 4777 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370829 4777 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370841 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370853 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370865 4777 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370876 4777 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370888 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370904 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") 
on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370920 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370933 4777 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370945 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370957 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370975 4777 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370987 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.370999 4777 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371011 4777 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371034 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371047 4777 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371059 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371071 4777 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371082 4777 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371093 4777 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371106 4777 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371118 4777 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371130 4777 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371142 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371153 4777 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371165 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371176 4777 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371189 4777 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371201 4777 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371213 4777 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371224 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371235 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371247 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371259 4777 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371271 4777 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371282 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371297 4777 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371308 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371319 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371330 4777 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371342 4777 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371353 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 
21:38:20.371366 4777 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371378 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371389 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371401 4777 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371412 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371423 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371435 4777 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371448 4777 reconciler_common.go:293] "Volume detached for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371463 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371479 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371494 4777 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371510 4777 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371528 4777 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371544 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371561 4777 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371577 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371593 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371608 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371623 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371638 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371654 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371669 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc 
kubenswrapper[4777]: I0216 21:38:20.371684 4777 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371699 4777 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371736 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371752 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371771 4777 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371786 4777 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371802 4777 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371821 4777 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371836 4777 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371854 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371870 4777 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371886 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371901 4777 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371917 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371932 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.371995 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.372013 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.374540 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.387008 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.396784 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.408069 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.418806 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.420859 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.427764 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 16 21:38:20 crc kubenswrapper[4777]: W0216 21:38:20.433212 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-fdeceb2787f4ec45e7b7c6859cbb0ac4a6df5726f434327fccf96be18a5a8ff6 WatchSource:0}: Error finding container fdeceb2787f4ec45e7b7c6859cbb0ac4a6df5726f434327fccf96be18a5a8ff6: Status 404 returned error can't find the container with id fdeceb2787f4ec45e7b7c6859cbb0ac4a6df5726f434327fccf96be18a5a8ff6 Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.434472 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 16 21:38:20 crc kubenswrapper[4777]: W0216 21:38:20.447026 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-11bcc7e2ed1045081e90b4841a2cd7cd9a788bf8389d93b0d2b7a406b42a5cce WatchSource:0}: Error finding container 11bcc7e2ed1045081e90b4841a2cd7cd9a788bf8389d93b0d2b7a406b42a5cce: Status 404 returned error can't find the container with id 11bcc7e2ed1045081e90b4841a2cd7cd9a788bf8389d93b0d2b7a406b42a5cce Feb 16 21:38:20 crc kubenswrapper[4777]: W0216 21:38:20.448403 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-c050dbf04a2202a4d6e7cf94afda72e7681a1980532126231beb3d9d04ab9b47 WatchSource:0}: Error finding container c050dbf04a2202a4d6e7cf94afda72e7681a1980532126231beb3d9d04ab9b47: Status 404 returned error can't find the container with id c050dbf04a2202a4d6e7cf94afda72e7681a1980532126231beb3d9d04ab9b47 Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.557318 4777 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.567851 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.588530 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.600525 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.614367 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.625558 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.640233 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.648027 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.649509 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.658948 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.669995 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.679745 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.689414 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.702397 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.713395 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.724146 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.875290 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.875588 4777 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:38:21.87553895 +0000 UTC m=+22.458040092 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.875922 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.875953 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.875978 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:20 crc kubenswrapper[4777]: I0216 21:38:20.875998 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876134 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876167 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876180 4777 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876179 4777 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876233 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:21.876217829 +0000 UTC m=+22.458718931 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876141 4777 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876258 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876300 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876312 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:21.876281481 +0000 UTC m=+22.458782623 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876323 4777 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876349 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:21.876335352 +0000 UTC m=+22.458836484 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:20 crc kubenswrapper[4777]: E0216 21:38:20.876391 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:21.876371193 +0000 UTC m=+22.458872325 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.113984 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 06:22:05.249484531 +0000 UTC Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.261873 4777 csr.go:261] certificate signing request csr-pptjt is approved, waiting to be issued Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.302668 4777 csr.go:257] certificate signing request csr-pptjt is issued Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.349292 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c050dbf04a2202a4d6e7cf94afda72e7681a1980532126231beb3d9d04ab9b47"} Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.350582 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a"} Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.350615 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3"} Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.350629 4777 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"11bcc7e2ed1045081e90b4841a2cd7cd9a788bf8389d93b0d2b7a406b42a5cce"} Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.352286 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144"} Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.352319 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fdeceb2787f4ec45e7b7c6859cbb0ac4a6df5726f434327fccf96be18a5a8ff6"} Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.353917 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.355158 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9"} Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.355828 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.389951 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.422664 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.446459 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.549604 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.574524 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.585146 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.609688 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.624905 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.635277 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.650809 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.663644 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.678399 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.691382 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.705730 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.941898 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.942007 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.942045 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.942081 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:21 crc kubenswrapper[4777]: I0216 21:38:21.942107 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942163 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:38:23.942129103 +0000 UTC m=+24.524630205 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942262 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942258 4777 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942315 4777 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942372 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:23.942347369 +0000 UTC m=+24.524848471 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942274 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942411 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942286 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942430 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:23.94239907 +0000 UTC m=+24.524900172 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942458 4777 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942469 4777 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942524 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:23.942510243 +0000 UTC m=+24.525011415 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:21 crc kubenswrapper[4777]: E0216 21:38:21.942547 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:23.942538864 +0000 UTC m=+24.525040076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.114252 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 19:45:56.262001734 +0000 UTC Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.181099 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.181151 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.181215 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:22 crc kubenswrapper[4777]: E0216 21:38:22.181287 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:22 crc kubenswrapper[4777]: E0216 21:38:22.181410 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:22 crc kubenswrapper[4777]: E0216 21:38:22.181510 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.184653 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.185307 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.186424 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.187084 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.188015 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.188520 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.189090 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.189971 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.190571 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.191431 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.191909 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.192906 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.193383 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.193885 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.194687 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.195172 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.196040 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.196464 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.197062 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.197979 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.198440 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.199340 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.199754 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.200902 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.201471 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.202159 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.203154 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.203623 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.204496 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.204933 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.205711 4777 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.206209 4777 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.207764 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.208725 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.209158 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.210666 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.211420 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.212294 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.212912 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.214186 4777 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.214650 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.215567 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.216184 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.217090 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.217519 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.218420 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.218907 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.219956 4777 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.220425 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.221251 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.221702 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.222580 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.223240 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.223667 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.241587 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8tx24"] Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.241945 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-vpf28"] Feb 16 
21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.243544 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8tx24" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.244634 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zq5h9"] Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.245946 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.246099 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-h78cj"] Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.246972 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.247530 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.248441 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.248524 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.249090 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.252255 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.252560 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.254672 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.255055 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.255184 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.255217 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.255440 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.255530 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.255601 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.255551 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.255685 4777 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.255551 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.266238 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.279855 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.291855 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.303111 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.304142 4777 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-16 21:33:21 +0000 UTC, rotation deadline is 2027-01-11 00:37:54.913384894 +0000 UTC Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.304218 4777 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7874h59m32.609170298s for next certificate rotation Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.313237 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.325650 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.339031 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345125 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-multus-cni-dir\") pod \"multus-vpf28\" (UID: 
\"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345174 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-run-multus-certs\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345210 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-cnibin\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345226 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-multus-conf-dir\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345244 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c4c6202-048b-4373-9f44-f5eb0de89993-cni-binary-copy\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345261 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-hostroot\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " 
pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345276 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a072a4b3-dbc6-4e52-8a35-2e67069603c3-hosts-file\") pod \"node-resolver-8tx24\" (UID: \"a072a4b3-dbc6-4e52-8a35-2e67069603c3\") " pod="openshift-dns/node-resolver-8tx24" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345296 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fbd6cb2a-0e80-4642-ad1e-993774971496-rootfs\") pod \"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345394 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-os-release\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345454 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c4c6202-048b-4373-9f44-f5eb0de89993-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345501 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-system-cni-dir\") pod \"multus-vpf28\" 
(UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345531 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71656da7-4f33-419d-aaba-93bf9158f706-cni-binary-copy\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345556 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xrdg\" (UniqueName: \"kubernetes.io/projected/0c4c6202-048b-4373-9f44-f5eb0de89993-kube-api-access-6xrdg\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345581 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6hc9\" (UniqueName: \"kubernetes.io/projected/a072a4b3-dbc6-4e52-8a35-2e67069603c3-kube-api-access-t6hc9\") pod \"node-resolver-8tx24\" (UID: \"a072a4b3-dbc6-4e52-8a35-2e67069603c3\") " pod="openshift-dns/node-resolver-8tx24" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345605 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-var-lib-cni-bin\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345673 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-var-lib-kubelet\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345753 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-os-release\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345773 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjqvf\" (UniqueName: \"kubernetes.io/projected/fbd6cb2a-0e80-4642-ad1e-993774971496-kube-api-access-tjqvf\") pod \"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345796 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbd6cb2a-0e80-4642-ad1e-993774971496-proxy-tls\") pod \"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345815 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-etc-kubernetes\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345834 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-cnibin\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345895 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-multus-socket-dir-parent\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.345939 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-run-netns\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.346001 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-run-k8s-cni-cncf-io\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.346025 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-var-lib-cni-multus\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.346043 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t5zs8\" (UniqueName: \"kubernetes.io/projected/71656da7-4f33-419d-aaba-93bf9158f706-kube-api-access-t5zs8\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.346057 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbd6cb2a-0e80-4642-ad1e-993774971496-mcd-auth-proxy-config\") pod \"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.346072 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-system-cni-dir\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.346094 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.346147 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/71656da7-4f33-419d-aaba-93bf9158f706-multus-daemon-config\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.352070 
4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.377051 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.392430 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.411375 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.442668 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447043 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-run-k8s-cni-cncf-io\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447088 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-var-lib-cni-multus\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447108 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5zs8\" (UniqueName: \"kubernetes.io/projected/71656da7-4f33-419d-aaba-93bf9158f706-kube-api-access-t5zs8\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447137 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbd6cb2a-0e80-4642-ad1e-993774971496-mcd-auth-proxy-config\") pod \"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447159 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-system-cni-dir\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447180 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447209 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/71656da7-4f33-419d-aaba-93bf9158f706-multus-daemon-config\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447207 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-run-k8s-cni-cncf-io\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447237 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-multus-cni-dir\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447258 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-run-multus-certs\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447289 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-cnibin\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447310 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-var-lib-cni-multus\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447316 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-multus-conf-dir\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447339 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c4c6202-048b-4373-9f44-f5eb0de89993-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447364 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-hostroot\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447385 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a072a4b3-dbc6-4e52-8a35-2e67069603c3-hosts-file\") pod \"node-resolver-8tx24\" (UID: \"a072a4b3-dbc6-4e52-8a35-2e67069603c3\") " pod="openshift-dns/node-resolver-8tx24" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447409 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fbd6cb2a-0e80-4642-ad1e-993774971496-rootfs\") pod \"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447432 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-os-release\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447663 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c4c6202-048b-4373-9f44-f5eb0de89993-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: 
\"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447698 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-system-cni-dir\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447734 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71656da7-4f33-419d-aaba-93bf9158f706-cni-binary-copy\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447761 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xrdg\" (UniqueName: \"kubernetes.io/projected/0c4c6202-048b-4373-9f44-f5eb0de89993-kube-api-access-6xrdg\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447782 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-var-lib-cni-bin\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447800 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-var-lib-kubelet\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " 
pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447820 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6hc9\" (UniqueName: \"kubernetes.io/projected/a072a4b3-dbc6-4e52-8a35-2e67069603c3-kube-api-access-t6hc9\") pod \"node-resolver-8tx24\" (UID: \"a072a4b3-dbc6-4e52-8a35-2e67069603c3\") " pod="openshift-dns/node-resolver-8tx24" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447840 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-os-release\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447846 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-hostroot\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447862 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjqvf\" (UniqueName: \"kubernetes.io/projected/fbd6cb2a-0e80-4642-ad1e-993774971496-kube-api-access-tjqvf\") pod \"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447882 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbd6cb2a-0e80-4642-ad1e-993774971496-proxy-tls\") pod \"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 
21:38:22.447911 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-multus-socket-dir-parent\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447933 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-run-netns\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447956 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-etc-kubernetes\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448002 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-cnibin\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448062 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-cnibin\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448117 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/a072a4b3-dbc6-4e52-8a35-2e67069603c3-hosts-file\") pod \"node-resolver-8tx24\" (UID: \"a072a4b3-dbc6-4e52-8a35-2e67069603c3\") " pod="openshift-dns/node-resolver-8tx24" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448150 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fbd6cb2a-0e80-4642-ad1e-993774971496-rootfs\") pod \"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448438 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbd6cb2a-0e80-4642-ad1e-993774971496-mcd-auth-proxy-config\") pod \"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448476 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-os-release\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448498 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-cnibin\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448553 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-tuning-conf-dir\") 
pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448603 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-run-multus-certs\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448743 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-multus-cni-dir\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448766 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-system-cni-dir\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448768 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/71656da7-4f33-419d-aaba-93bf9158f706-multus-daemon-config\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448789 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-os-release\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 
21:38:22.448834 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-multus-socket-dir-parent\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448833 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-run-netns\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.447286 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c4c6202-048b-4373-9f44-f5eb0de89993-system-cni-dir\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448858 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-multus-conf-dir\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.448885 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-etc-kubernetes\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.449153 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-var-lib-cni-bin\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.449211 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71656da7-4f33-419d-aaba-93bf9158f706-host-var-lib-kubelet\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.449338 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c4c6202-048b-4373-9f44-f5eb0de89993-cni-binary-copy\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.449403 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71656da7-4f33-419d-aaba-93bf9158f706-cni-binary-copy\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.449635 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c4c6202-048b-4373-9f44-f5eb0de89993-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.452388 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbd6cb2a-0e80-4642-ad1e-993774971496-proxy-tls\") pod 
\"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.484639 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5zs8\" (UniqueName: \"kubernetes.io/projected/71656da7-4f33-419d-aaba-93bf9158f706-kube-api-access-t5zs8\") pod \"multus-vpf28\" (UID: \"71656da7-4f33-419d-aaba-93bf9158f706\") " pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.487947 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6hc9\" (UniqueName: \"kubernetes.io/projected/a072a4b3-dbc6-4e52-8a35-2e67069603c3-kube-api-access-t6hc9\") pod \"node-resolver-8tx24\" (UID: \"a072a4b3-dbc6-4e52-8a35-2e67069603c3\") " pod="openshift-dns/node-resolver-8tx24" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.489120 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.494278 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xrdg\" (UniqueName: \"kubernetes.io/projected/0c4c6202-048b-4373-9f44-f5eb0de89993-kube-api-access-6xrdg\") pod \"multus-additional-cni-plugins-zq5h9\" (UID: \"0c4c6202-048b-4373-9f44-f5eb0de89993\") " pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.498351 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjqvf\" (UniqueName: \"kubernetes.io/projected/fbd6cb2a-0e80-4642-ad1e-993774971496-kube-api-access-tjqvf\") pod \"machine-config-daemon-h78cj\" (UID: \"fbd6cb2a-0e80-4642-ad1e-993774971496\") " pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.520512 4777 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.536435 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.547754 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.559187 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.560283 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8tx24" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.569696 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" Feb 16 21:38:22 crc kubenswrapper[4777]: W0216 21:38:22.572527 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda072a4b3_dbc6_4e52_8a35_2e67069603c3.slice/crio-0bcbf4808b3f3ff0a3d472066342896be6e40b6a5b383572b4f851dca5bb99d4 WatchSource:0}: Error finding container 0bcbf4808b3f3ff0a3d472066342896be6e40b6a5b383572b4f851dca5bb99d4: Status 404 returned error can't find the container with id 0bcbf4808b3f3ff0a3d472066342896be6e40b6a5b383572b4f851dca5bb99d4 Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.575420 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.576412 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.583828 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vpf28" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.589179 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: W0216 21:38:22.592824 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd6cb2a_0e80_4642_ad1e_993774971496.slice/crio-2f66aaf9c86e694b370885bc8631b5410f95be6994d42094e81a9dab11f9702a WatchSource:0}: Error finding container 2f66aaf9c86e694b370885bc8631b5410f95be6994d42094e81a9dab11f9702a: Status 404 returned error can't find the container with id 2f66aaf9c86e694b370885bc8631b5410f95be6994d42094e81a9dab11f9702a Feb 16 21:38:22 crc kubenswrapper[4777]: W0216 21:38:22.594965 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c4c6202_048b_4373_9f44_f5eb0de89993.slice/crio-753bf3fd5650e80693b491841fa010cfafde3ffca933a6edbac1bacf791c74a2 WatchSource:0}: Error finding container 753bf3fd5650e80693b491841fa010cfafde3ffca933a6edbac1bacf791c74a2: Status 404 returned error can't find the container with id 753bf3fd5650e80693b491841fa010cfafde3ffca933a6edbac1bacf791c74a2 Feb 16 21:38:22 crc kubenswrapper[4777]: W0216 21:38:22.602149 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71656da7_4f33_419d_aaba_93bf9158f706.slice/crio-ce45be201ba40344de95e02fabf6120ba9f621b571f7a4d3e3cec877f62b6c6f WatchSource:0}: Error finding container ce45be201ba40344de95e02fabf6120ba9f621b571f7a4d3e3cec877f62b6c6f: Status 404 returned error can't find the container with id 
ce45be201ba40344de95e02fabf6120ba9f621b571f7a4d3e3cec877f62b6c6f Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.628622 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w27qk"] Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.637333 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.641445 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.641617 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.641640 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.641773 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.641899 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.642221 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.642251 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.650296 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-openvswitch\") pod 
\"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.650447 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-ovn\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.650553 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-systemd\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.650659 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-slash\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.650805 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-log-socket\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.650901 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2hhz\" (UniqueName: \"kubernetes.io/projected/a3c293d7-2d38-4047-a104-7f354aebf216-kube-api-access-k2hhz\") pod 
\"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.651116 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-kubelet\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.651218 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-systemd-units\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.651312 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-script-lib\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.651416 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-node-log\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.651508 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-netd\") 
pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.651750 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3c293d7-2d38-4047-a104-7f354aebf216-ovn-node-metrics-cert\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.651831 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-etc-openvswitch\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.651896 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-config\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.651947 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-bin\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.652005 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-netns\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.652046 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-var-lib-openvswitch\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.652079 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-ovn-kubernetes\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.652114 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.652146 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-env-overrides\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.656164 4777 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.681804 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.697398 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.716557 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.742696 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.753444 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-config\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 
21:38:22.753487 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-bin\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.753513 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-netns\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.753552 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-var-lib-openvswitch\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.753582 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-ovn-kubernetes\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.753623 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-bin\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.753674 4777 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-var-lib-openvswitch\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.753649 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-netns\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.753747 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-ovn-kubernetes\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.753775 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.753809 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-env-overrides\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.753824 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-openvswitch\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754319 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-env-overrides\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754359 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754383 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-openvswitch\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754410 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-ovn\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754440 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-slash\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754525 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-systemd\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754544 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-log-socket\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754588 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-systemd\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754472 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-ovn\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754491 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-slash\") pod \"ovnkube-node-w27qk\" (UID: 
\"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754624 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-config\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754559 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2hhz\" (UniqueName: \"kubernetes.io/projected/a3c293d7-2d38-4047-a104-7f354aebf216-kube-api-access-k2hhz\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754694 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-kubelet\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754731 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-systemd-units\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754765 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-node-log\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754782 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-script-lib\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754799 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-netd\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754815 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3c293d7-2d38-4047-a104-7f354aebf216-ovn-node-metrics-cert\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754828 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-node-log\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754835 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-etc-openvswitch\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc 
kubenswrapper[4777]: I0216 21:38:22.754863 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-etc-openvswitch\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754889 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-systemd-units\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754797 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-kubelet\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754932 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-netd\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.754639 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-log-socket\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.755333 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-script-lib\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.757561 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.760549 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3c293d7-2d38-4047-a104-7f354aebf216-ovn-node-metrics-cert\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.777284 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.778266 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2hhz\" (UniqueName: \"kubernetes.io/projected/a3c293d7-2d38-4047-a104-7f354aebf216-kube-api-access-k2hhz\") pod \"ovnkube-node-w27qk\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.803247 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.816575 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.830080 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.846577 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.869764 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:22 crc kubenswrapper[4777]: I0216 21:38:22.982576 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:22 crc kubenswrapper[4777]: W0216 21:38:22.995033 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c293d7_2d38_4047_a104_7f354aebf216.slice/crio-7c449b5c07ecbf291af601994f4937b3f67315035262dbb45b66b3e7bd00103f WatchSource:0}: Error finding container 7c449b5c07ecbf291af601994f4937b3f67315035262dbb45b66b3e7bd00103f: Status 404 returned error can't find the container with id 7c449b5c07ecbf291af601994f4937b3f67315035262dbb45b66b3e7bd00103f Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.114477 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 19:11:52.096128115 +0000 UTC Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.360615 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e"} Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.362810 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3" exitCode=0 Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.362884 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3"} Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.362918 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" 
event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"7c449b5c07ecbf291af601994f4937b3f67315035262dbb45b66b3e7bd00103f"} Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.364778 4777 generic.go:334] "Generic (PLEG): container finished" podID="0c4c6202-048b-4373-9f44-f5eb0de89993" containerID="45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4" exitCode=0 Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.364833 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" event={"ID":"0c4c6202-048b-4373-9f44-f5eb0de89993","Type":"ContainerDied","Data":"45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4"} Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.364876 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" event={"ID":"0c4c6202-048b-4373-9f44-f5eb0de89993","Type":"ContainerStarted","Data":"753bf3fd5650e80693b491841fa010cfafde3ffca933a6edbac1bacf791c74a2"} Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.366949 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8tx24" event={"ID":"a072a4b3-dbc6-4e52-8a35-2e67069603c3","Type":"ContainerStarted","Data":"899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e"} Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.367001 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8tx24" event={"ID":"a072a4b3-dbc6-4e52-8a35-2e67069603c3","Type":"ContainerStarted","Data":"0bcbf4808b3f3ff0a3d472066342896be6e40b6a5b383572b4f851dca5bb99d4"} Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.369231 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vpf28" event={"ID":"71656da7-4f33-419d-aaba-93bf9158f706","Type":"ContainerStarted","Data":"9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de"} Feb 16 
21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.369307 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vpf28" event={"ID":"71656da7-4f33-419d-aaba-93bf9158f706","Type":"ContainerStarted","Data":"ce45be201ba40344de95e02fabf6120ba9f621b571f7a4d3e3cec877f62b6c6f"} Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.371548 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4"} Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.371578 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9"} Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.371589 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"2f66aaf9c86e694b370885bc8631b5410f95be6994d42094e81a9dab11f9702a"} Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.376679 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.393543 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.408271 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.422175 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.447932 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.460001 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.471660 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.483042 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.493453 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.504214 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.516701 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.537737 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.556584 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.571325 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.595568 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.608822 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 
21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.624917 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.641423 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.654884 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.668697 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.679897 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.690034 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.702835 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.720000 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:23Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.965869 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.966113 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:38:27.966079483 +0000 UTC m=+28.548580585 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.966304 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.966369 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.966406 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:23 crc kubenswrapper[4777]: I0216 21:38:23.966438 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.966583 4777 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.966669 4777 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.966702 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:27.966675639 +0000 UTC m=+28.549176751 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.966594 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.966947 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.966960 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:27.966914695 +0000 UTC m=+28.549415857 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.966977 4777 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.966594 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.967032 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.967052 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:27.967029659 +0000 UTC m=+28.549530771 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.967064 4777 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:23 crc kubenswrapper[4777]: E0216 21:38:23.967157 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:27.967136852 +0000 UTC m=+28.549638144 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.115569 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 04:16:31.673005499 +0000 UTC Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.172757 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-q2ff8"] Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.173361 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q2ff8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.175443 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.175444 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.175936 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.176473 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.180832 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.180861 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.180933 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:24 crc kubenswrapper[4777]: E0216 21:38:24.180996 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:24 crc kubenswrapper[4777]: E0216 21:38:24.181117 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:24 crc kubenswrapper[4777]: E0216 21:38:24.181213 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.193672 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.214124 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.232347 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.245678 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 
21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.261634 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.269203 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7818180b-37e3-4bbb-bdd6-7ac570c5ea2c-host\") pod \"node-ca-q2ff8\" (UID: \"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\") " pod="openshift-image-registry/node-ca-q2ff8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.269287 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rfpk\" (UniqueName: \"kubernetes.io/projected/7818180b-37e3-4bbb-bdd6-7ac570c5ea2c-kube-api-access-4rfpk\") pod \"node-ca-q2ff8\" (UID: \"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\") " pod="openshift-image-registry/node-ca-q2ff8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.269360 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7818180b-37e3-4bbb-bdd6-7ac570c5ea2c-serviceca\") pod \"node-ca-q2ff8\" (UID: \"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\") " pod="openshift-image-registry/node-ca-q2ff8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.278378 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.307005 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.325026 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.338157 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.350750 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.363996 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.370702 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rfpk\" (UniqueName: \"kubernetes.io/projected/7818180b-37e3-4bbb-bdd6-7ac570c5ea2c-kube-api-access-4rfpk\") pod \"node-ca-q2ff8\" (UID: \"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\") " pod="openshift-image-registry/node-ca-q2ff8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.370803 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7818180b-37e3-4bbb-bdd6-7ac570c5ea2c-serviceca\") pod \"node-ca-q2ff8\" (UID: \"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\") " pod="openshift-image-registry/node-ca-q2ff8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.370845 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7818180b-37e3-4bbb-bdd6-7ac570c5ea2c-host\") pod \"node-ca-q2ff8\" (UID: \"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\") " pod="openshift-image-registry/node-ca-q2ff8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.370904 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7818180b-37e3-4bbb-bdd6-7ac570c5ea2c-host\") pod \"node-ca-q2ff8\" (UID: \"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\") " pod="openshift-image-registry/node-ca-q2ff8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.372042 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7818180b-37e3-4bbb-bdd6-7ac570c5ea2c-serviceca\") pod \"node-ca-q2ff8\" (UID: \"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\") " pod="openshift-image-registry/node-ca-q2ff8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 
21:38:24.378584 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65"} Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.378619 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b"} Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.378627 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936"} Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.378637 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d"} Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.378645 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a"} Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.381123 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.381421 4777 generic.go:334] "Generic (PLEG): container finished" podID="0c4c6202-048b-4373-9f44-f5eb0de89993" containerID="0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f" exitCode=0 Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.381468 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" event={"ID":"0c4c6202-048b-4373-9f44-f5eb0de89993","Type":"ContainerDied","Data":"0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f"} Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.398486 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.401395 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rfpk\" (UniqueName: \"kubernetes.io/projected/7818180b-37e3-4bbb-bdd6-7ac570c5ea2c-kube-api-access-4rfpk\") pod \"node-ca-q2ff8\" (UID: \"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\") " pod="openshift-image-registry/node-ca-q2ff8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.415807 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.430438 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.447704 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.462477 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.475879 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.488121 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.499572 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.508600 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q2ff8" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.515571 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.528388 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.551281 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.558347 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.562940 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.566184 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-vpf28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.567426 4777 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.584841 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.598375 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.612773 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.629541 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.643971 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.653816 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.669037 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.718159 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.736844 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.751016 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.763181 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.774706 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.787414 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.800438 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.824439 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:24 crc kubenswrapper[4777]: I0216 21:38:24.872596 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:24Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.116758 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 07:06:57.181358234 +0000 UTC Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.394802 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa"} Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.397038 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q2ff8" event={"ID":"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c","Type":"ContainerStarted","Data":"9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8"} Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.397107 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q2ff8" event={"ID":"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c","Type":"ContainerStarted","Data":"f1580e37cec19dfccf1a0bb0263356450cce8a4ddaceddbb463a164979b9dca2"} Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.400177 4777 generic.go:334] "Generic (PLEG): 
container finished" podID="0c4c6202-048b-4373-9f44-f5eb0de89993" containerID="2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54" exitCode=0 Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.400338 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" event={"ID":"0c4c6202-048b-4373-9f44-f5eb0de89993","Type":"ContainerDied","Data":"2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54"} Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.414312 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd
1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.
11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.438997 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.459106 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.478275 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.493371 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.513300 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.527090 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.542079 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.556304 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.574561 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.586934 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.601406 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.612670 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.623531 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.638854 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 
21:38:25.650784 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.665586 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491c
d10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.679942 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.691180 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.704154 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.728798 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.746445 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.786777 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.826334 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.863890 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.904969 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.954236 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:25 crc kubenswrapper[4777]: I0216 21:38:25.989590 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:25Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.057596 4777 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.059616 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.059658 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.059671 
4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.059790 4777 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.068011 4777 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.068324 4777 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.069956 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.070038 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.070058 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.070085 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.070106 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: E0216 21:38:26.101215 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.107397 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.107461 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.107478 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.107507 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.107527 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.117133 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 00:07:06.24034527 +0000 UTC Feb 16 21:38:26 crc kubenswrapper[4777]: E0216 21:38:26.132736 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",
\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.137539 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.137577 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.137588 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.137607 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.137622 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: E0216 21:38:26.154086 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.158906 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.158978 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.158999 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.159028 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.159049 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: E0216 21:38:26.179412 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.180701 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.180701 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.180817 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:26 crc kubenswrapper[4777]: E0216 21:38:26.180914 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:26 crc kubenswrapper[4777]: E0216 21:38:26.180863 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:26 crc kubenswrapper[4777]: E0216 21:38:26.181108 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.182849 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.182884 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.182894 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.182910 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.182920 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: E0216 21:38:26.201141 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: E0216 21:38:26.201266 4777 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.203647 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.203737 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.203754 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.203775 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.203790 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.306390 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.306431 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.306446 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.306461 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.306471 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.411958 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.412007 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.412028 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.412054 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.412072 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.415594 4777 generic.go:334] "Generic (PLEG): container finished" podID="0c4c6202-048b-4373-9f44-f5eb0de89993" containerID="f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53" exitCode=0 Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.415671 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" event={"ID":"0c4c6202-048b-4373-9f44-f5eb0de89993","Type":"ContainerDied","Data":"f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53"} Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.443409 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.462973 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.488915 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.505360 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.516003 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.516078 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.516103 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.516131 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.516176 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.525190 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.544511 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.564865 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.596919 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.622230 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.622276 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.622290 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.622311 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.622323 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.634139 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.668900 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.683672 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.698634 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.714026 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.724570 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc 
kubenswrapper[4777]: I0216 21:38:26.724616 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.724626 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.724643 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.724653 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.736185 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:26Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.827871 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.827933 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.827949 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.827969 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.827984 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.930601 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.930654 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.930672 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.930699 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:26 crc kubenswrapper[4777]: I0216 21:38:26.930742 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:26Z","lastTransitionTime":"2026-02-16T21:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.033407 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.033477 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.033505 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.033539 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.033564 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:27Z","lastTransitionTime":"2026-02-16T21:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.118328 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 22:56:08.388737757 +0000 UTC Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.136612 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.136671 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.136686 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.136732 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.136756 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:27Z","lastTransitionTime":"2026-02-16T21:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.239759 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.239797 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.239805 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.239820 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.239831 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:27Z","lastTransitionTime":"2026-02-16T21:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.342339 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.342376 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.342387 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.342404 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.342414 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:27Z","lastTransitionTime":"2026-02-16T21:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.430422 4777 generic.go:334] "Generic (PLEG): container finished" podID="0c4c6202-048b-4373-9f44-f5eb0de89993" containerID="2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2" exitCode=0 Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.430489 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" event={"ID":"0c4c6202-048b-4373-9f44-f5eb0de89993","Type":"ContainerDied","Data":"2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.440571 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.448171 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.448219 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.448229 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.448247 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.448259 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:27Z","lastTransitionTime":"2026-02-16T21:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.452102 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.475613 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.490921 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.504611 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.523005 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.543538 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.550982 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.551011 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.551023 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.551041 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.551056 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:27Z","lastTransitionTime":"2026-02-16T21:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.557013 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.568128 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.579510 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.597164 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.609430 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 
21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.626135 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.636807 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.649282 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:27Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.653731 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.653781 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.653794 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.653816 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.653829 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:27Z","lastTransitionTime":"2026-02-16T21:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.756272 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.756313 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.756324 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.756342 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.756354 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:27Z","lastTransitionTime":"2026-02-16T21:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.859643 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.859705 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.859782 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.859813 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.859832 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:27Z","lastTransitionTime":"2026-02-16T21:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.963352 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.963432 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.963455 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.963488 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:27 crc kubenswrapper[4777]: I0216 21:38:27.963514 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:27Z","lastTransitionTime":"2026-02-16T21:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.011892 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.012069 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012128 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:38:36.012079211 +0000 UTC m=+36.594580373 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.012231 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012259 4777 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.012318 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012349 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:36.012323208 +0000 UTC m=+36.594824340 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.012373 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012514 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012580 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012609 4777 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012514 4777 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012691 4777 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:36.012663727 +0000 UTC m=+36.595164909 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012533 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012784 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012807 4777 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012786 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:36.0127575 +0000 UTC m=+36.595258642 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.012891 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:36.012870993 +0000 UTC m=+36.595372205 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.067074 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.067174 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.067197 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.067232 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.067254 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:28Z","lastTransitionTime":"2026-02-16T21:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.119443 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 12:53:20.700785018 +0000 UTC Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.171373 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.171453 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.171479 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.171513 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.171537 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:28Z","lastTransitionTime":"2026-02-16T21:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.181153 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.181185 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.181233 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.181358 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.181476 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:28 crc kubenswrapper[4777]: E0216 21:38:28.181647 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.274931 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.274989 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.275008 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.275033 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.275051 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:28Z","lastTransitionTime":"2026-02-16T21:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.378147 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.378221 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.378246 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.378285 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.378310 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:28Z","lastTransitionTime":"2026-02-16T21:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.451365 4777 generic.go:334] "Generic (PLEG): container finished" podID="0c4c6202-048b-4373-9f44-f5eb0de89993" containerID="127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86" exitCode=0 Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.451434 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" event={"ID":"0c4c6202-048b-4373-9f44-f5eb0de89993","Type":"ContainerDied","Data":"127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86"} Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.477580 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.481106 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.481149 4777 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.481168 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.481235 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.481262 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:28Z","lastTransitionTime":"2026-02-16T21:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.504906 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.527309 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.541744 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.560280 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.584046 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.585095 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.585153 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.585165 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.585186 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.585203 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:28Z","lastTransitionTime":"2026-02-16T21:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.602574 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.616781 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.631983 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.650236 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.664372 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.679564 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.689283 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.689364 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.689377 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.689395 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.689407 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:28Z","lastTransitionTime":"2026-02-16T21:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.693060 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.715746 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:28Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.792493 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.792871 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.792891 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.792917 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.792938 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:28Z","lastTransitionTime":"2026-02-16T21:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.897092 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.897158 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.897176 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.897203 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:28 crc kubenswrapper[4777]: I0216 21:38:28.897224 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:28Z","lastTransitionTime":"2026-02-16T21:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.000643 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.000765 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.000793 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.000831 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.000854 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:29Z","lastTransitionTime":"2026-02-16T21:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.104528 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.104611 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.104631 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.104662 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.104682 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:29Z","lastTransitionTime":"2026-02-16T21:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.120161 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 22:58:39.252138053 +0000 UTC Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.208349 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.208407 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.208419 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.208438 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.208454 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:29Z","lastTransitionTime":"2026-02-16T21:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.311316 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.311371 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.311384 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.311402 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.311414 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:29Z","lastTransitionTime":"2026-02-16T21:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.414657 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.414754 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.414775 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.414802 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.414820 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:29Z","lastTransitionTime":"2026-02-16T21:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.462082 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625"} Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.462668 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.462874 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.466954 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" event={"ID":"0c4c6202-048b-4373-9f44-f5eb0de89993","Type":"ContainerStarted","Data":"5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1"} Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.484868 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.498596 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.500817 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.508413 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.518166 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.518238 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.518257 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.518285 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.518304 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:29Z","lastTransitionTime":"2026-02-16T21:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.534417 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.553526 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.570525 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.591254 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.610469 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.628255 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.642333 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.658345 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.683473 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.702267 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.723624 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.889996 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:29 crc 
kubenswrapper[4777]: I0216 21:38:29.890069 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.890093 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.890123 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.890146 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:29Z","lastTransitionTime":"2026-02-16T21:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.905789 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909cea
a817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.923171 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.940899 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.951678 4777 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.952599 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ovn-kubernetes/pods/ovnkube-node-w27qk/status\": read tcp 38.102.83.203:33218->38.102.83.203:6443: use of closed network connection" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.987892 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:29Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.992566 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.992613 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.992625 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.992644 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:29 crc kubenswrapper[4777]: I0216 21:38:29.992659 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:29Z","lastTransitionTime":"2026-02-16T21:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.003358 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.015246 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.029754 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.048080 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.073600 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.095666 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.095724 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.095734 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.095755 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.095767 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:30Z","lastTransitionTime":"2026-02-16T21:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.107078 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.120852 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 18:53:03.694001053 +0000 UTC Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.121949 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.133434 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.146449 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.158769 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.180733 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.180836 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:30 crc kubenswrapper[4777]: E0216 21:38:30.180885 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:30 crc kubenswrapper[4777]: E0216 21:38:30.180987 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.181149 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:30 crc kubenswrapper[4777]: E0216 21:38:30.181365 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.194035 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.198380 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.198506 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.198528 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.198561 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.198586 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:30Z","lastTransitionTime":"2026-02-16T21:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.207252 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.232401 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.248465 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.263657 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.282026 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.299662 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.303778 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.303827 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.303842 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.303866 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.303881 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:30Z","lastTransitionTime":"2026-02-16T21:38:30Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.319467 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.336328 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.353697 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.368693 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.388252 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.406089 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.407368 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.407443 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.407514 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.407543 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.407646 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:30Z","lastTransitionTime":"2026-02-16T21:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.425783 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:30Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.470795 4777 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.510232 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.510327 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.510344 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.510368 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.510382 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:30Z","lastTransitionTime":"2026-02-16T21:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.612563 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.612602 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.612610 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.612625 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.612636 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:30Z","lastTransitionTime":"2026-02-16T21:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.715972 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.716306 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.716321 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.716348 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.716361 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:30Z","lastTransitionTime":"2026-02-16T21:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.819568 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.819637 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.819656 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.819684 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.819703 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:30Z","lastTransitionTime":"2026-02-16T21:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.922855 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.922889 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.922897 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.922912 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:30 crc kubenswrapper[4777]: I0216 21:38:30.922923 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:30Z","lastTransitionTime":"2026-02-16T21:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.025708 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.025868 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.025886 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.025916 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.025935 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:31Z","lastTransitionTime":"2026-02-16T21:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.121988 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 23:27:11.520910918 +0000 UTC Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.129165 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.129188 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.129196 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.129212 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.129224 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:31Z","lastTransitionTime":"2026-02-16T21:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.232197 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.232227 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.232235 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.232251 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.232261 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:31Z","lastTransitionTime":"2026-02-16T21:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.335315 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.335364 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.335377 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.335394 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.335406 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:31Z","lastTransitionTime":"2026-02-16T21:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.439021 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.439092 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.439120 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.439153 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.439210 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:31Z","lastTransitionTime":"2026-02-16T21:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.478449 4777 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.542786 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.542907 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.542925 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.542951 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.542967 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:31Z","lastTransitionTime":"2026-02-16T21:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.645655 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.645705 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.645736 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.645760 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.645777 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:31Z","lastTransitionTime":"2026-02-16T21:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.748927 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.748975 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.748986 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.749009 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.749026 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:31Z","lastTransitionTime":"2026-02-16T21:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.851163 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.851212 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.851221 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.851241 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.851253 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:31Z","lastTransitionTime":"2026-02-16T21:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.954271 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.954337 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.954358 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.954387 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:31 crc kubenswrapper[4777]: I0216 21:38:31.954406 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:31Z","lastTransitionTime":"2026-02-16T21:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.057658 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.057905 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.057982 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.058079 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.058183 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:32Z","lastTransitionTime":"2026-02-16T21:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.123007 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 05:59:29.794393728 +0000 UTC Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.162561 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.162597 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.162606 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.162621 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.162631 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:32Z","lastTransitionTime":"2026-02-16T21:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.181359 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.181406 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.181398 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:32 crc kubenswrapper[4777]: E0216 21:38:32.181504 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:32 crc kubenswrapper[4777]: E0216 21:38:32.181567 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:32 crc kubenswrapper[4777]: E0216 21:38:32.181687 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.265580 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.265656 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.265679 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.265711 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.265772 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:32Z","lastTransitionTime":"2026-02-16T21:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.368754 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.368822 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.368840 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.368874 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.368893 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:32Z","lastTransitionTime":"2026-02-16T21:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.472741 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.472808 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.472826 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.472851 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.472870 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:32Z","lastTransitionTime":"2026-02-16T21:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.576556 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.576632 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.576657 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.576690 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.576745 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:32Z","lastTransitionTime":"2026-02-16T21:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.680336 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.680430 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.680448 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.680490 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.680509 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:32Z","lastTransitionTime":"2026-02-16T21:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.783192 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.783266 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.783283 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.783307 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.783325 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:32Z","lastTransitionTime":"2026-02-16T21:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.886679 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.886760 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.886774 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.886792 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.886819 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:32Z","lastTransitionTime":"2026-02-16T21:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.989526 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.989605 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.989624 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.989650 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:32 crc kubenswrapper[4777]: I0216 21:38:32.989668 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:32Z","lastTransitionTime":"2026-02-16T21:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.093697 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.093835 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.093859 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.093893 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.093921 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:33Z","lastTransitionTime":"2026-02-16T21:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.123692 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 06:50:01.742678055 +0000 UTC Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.197789 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.197851 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.197869 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.197893 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.197913 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:33Z","lastTransitionTime":"2026-02-16T21:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.301805 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.301883 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.301906 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.301934 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.301957 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:33Z","lastTransitionTime":"2026-02-16T21:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.405332 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.405406 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.405423 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.405449 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.405469 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:33Z","lastTransitionTime":"2026-02-16T21:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.489969 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/0.log" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.494373 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625" exitCode=1 Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.494433 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625"} Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.495524 4777 scope.go:117] "RemoveContainer" containerID="bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.509129 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.509183 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.509204 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.509226 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.509246 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:33Z","lastTransitionTime":"2026-02-16T21:38:33Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.515567 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.544773 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.562976 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.581868 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.603053 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z"
Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.612580 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.612636 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.612655 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.612760 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.612816 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:33Z","lastTransitionTime":"2026-02-16T21:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.628119 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.649361 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.669977 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.689936 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.716259 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:33 crc 
kubenswrapper[4777]: I0216 21:38:33.716318 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.716332 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.716354 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.716371 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:33Z","lastTransitionTime":"2026-02-16T21:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.724645 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21
:38:32Z\\\",\\\"message\\\":\\\"oval\\\\nI0216 21:38:32.284401 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:32.284414 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:32.284448 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:32.284487 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0216 21:38:32.284495 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0216 21:38:32.284542 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:32.285061 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 21:38:32.285083 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 21:38:32.285095 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:32.285107 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:32.285119 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:32.285139 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:32.285159 6072 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:32.285251 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:32.285386 6072 factory.go:656] Stopping watch factory\\\\nI0216 21:38:32.285449 6072 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.744941 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.763999 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.792207 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.809340 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:33Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.819269 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.819312 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.819323 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.819362 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.819373 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:33Z","lastTransitionTime":"2026-02-16T21:38:33Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.924375 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.924429 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.924442 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.924462 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:33 crc kubenswrapper[4777]: I0216 21:38:33.924475 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:33Z","lastTransitionTime":"2026-02-16T21:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.030260 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.030314 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.030329 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.030351 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.030397 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:34Z","lastTransitionTime":"2026-02-16T21:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.124635 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 00:53:26.628229049 +0000 UTC Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.133156 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.133213 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.133232 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.133259 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.133292 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:34Z","lastTransitionTime":"2026-02-16T21:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.181065 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.181119 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:34 crc kubenswrapper[4777]: E0216 21:38:34.181241 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.181316 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:34 crc kubenswrapper[4777]: E0216 21:38:34.181445 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:34 crc kubenswrapper[4777]: E0216 21:38:34.181529 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.250226 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.250288 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.250301 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.250323 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.250336 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:34Z","lastTransitionTime":"2026-02-16T21:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.354101 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.354155 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.354167 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.354192 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.354208 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:34Z","lastTransitionTime":"2026-02-16T21:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.414850 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.426113 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.437272 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e1
3873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.452176 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.456506 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.456547 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.456555 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.456576 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.456588 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:34Z","lastTransitionTime":"2026-02-16T21:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.472192 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.485422 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.499103 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.501966 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/0.log" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.507488 4777 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.507677 4777 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.519133 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\
\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.537582 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.557071 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.559551 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:34 crc 
kubenswrapper[4777]: I0216 21:38:34.559604 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.559626 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.559650 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.559672 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:34Z","lastTransitionTime":"2026-02-16T21:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.585034 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21
:38:32Z\\\",\\\"message\\\":\\\"oval\\\\nI0216 21:38:32.284401 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:32.284414 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:32.284448 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:32.284487 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0216 21:38:32.284495 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0216 21:38:32.284542 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:32.285061 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 21:38:32.285083 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 21:38:32.285095 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:32.285107 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:32.285119 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:32.285139 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:32.285159 6072 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:32.285251 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:32.285386 6072 factory.go:656] Stopping watch factory\\\\nI0216 21:38:32.285449 6072 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.601572 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.617325 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.642445 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.660854 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.662521 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.662603 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.662624 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.662651 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.662673 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:34Z","lastTransitionTime":"2026-02-16T21:38:34Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.678397 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256
:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.681930 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.695047 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.720363 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.734187 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.764971 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.766577 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.766655 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.766681 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.766740 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.766766 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:34Z","lastTransitionTime":"2026-02-16T21:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.778265 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx"] Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.778789 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.782255 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.782589 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.790357 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.808650 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.823407 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.840683 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.849999 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b6c3c71-4486-4eae-8aae-424892a1d703-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.850219 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjhd\" (UniqueName: \"kubernetes.io/projected/7b6c3c71-4486-4eae-8aae-424892a1d703-kube-api-access-fjjhd\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.850416 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b6c3c71-4486-4eae-8aae-424892a1d703-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.850603 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b6c3c71-4486-4eae-8aae-424892a1d703-env-overrides\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 
21:38:34.861837 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.870657 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.870743 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.870757 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.870774 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.870787 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:34Z","lastTransitionTime":"2026-02-16T21:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.881449 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.905386 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.926779 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.951585 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjhd\" (UniqueName: 
\"kubernetes.io/projected/7b6c3c71-4486-4eae-8aae-424892a1d703-kube-api-access-fjjhd\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.951692 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b6c3c71-4486-4eae-8aae-424892a1d703-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.951766 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b6c3c71-4486-4eae-8aae-424892a1d703-env-overrides\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.951853 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b6c3c71-4486-4eae-8aae-424892a1d703-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.953099 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b6c3c71-4486-4eae-8aae-424892a1d703-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc 
kubenswrapper[4777]: I0216 21:38:34.953830 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b6c3c71-4486-4eae-8aae-424892a1d703-env-overrides\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.956140 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:32Z\\\",\\\"message\\\":\\\"oval\\\\nI0216 21:38:32.284401 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:32.284414 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:32.284448 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:32.284487 6072 handler.go:190] Sending 
*v1.Pod event handler 3 for removal\\\\nI0216 21:38:32.284495 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0216 21:38:32.284542 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:32.285061 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 21:38:32.285083 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 21:38:32.285095 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:32.285107 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:32.285119 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:32.285139 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:32.285159 6072 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:32.285251 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:32.285386 6072 factory.go:656] Stopping watch factory\\\\nI0216 21:38:32.285449 6072 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.960659 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b6c3c71-4486-4eae-8aae-424892a1d703-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.973943 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.973992 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:34 crc kubenswrapper[4777]: 
I0216 21:38:34.974008 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.974033 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.974051 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:34Z","lastTransitionTime":"2026-02-16T21:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.974405 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjhd\" (UniqueName: \"kubernetes.io/projected/7b6c3c71-4486-4eae-8aae-424892a1d703-kube-api-access-fjjhd\") pod \"ovnkube-control-plane-749d76644c-npnnx\" (UID: \"7b6c3c71-4486-4eae-8aae-424892a1d703\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:34 crc kubenswrapper[4777]: I0216 21:38:34.978870 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:34Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.008662 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.022449 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.039215 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.057219 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.077426 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:35 crc 
kubenswrapper[4777]: I0216 21:38:35.077488 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.077504 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.077534 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.077556 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:35Z","lastTransitionTime":"2026-02-16T21:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.086710 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:32Z\\\",\\\"message\\\":\\\"oval\\\\nI0216 21:38:32.284401 6072 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0216 21:38:32.284414 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:32.284448 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:32.284487 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0216 21:38:32.284495 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0216 21:38:32.284542 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:32.285061 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 21:38:32.285083 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 21:38:32.285095 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:32.285107 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:32.285119 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:32.285139 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:32.285159 6072 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:32.285251 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:32.285386 6072 factory.go:656] Stopping watch factory\\\\nI0216 21:38:32.285449 6072 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.103187 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.108949 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b07
96f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.124840 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 13:37:42.373819056 +0000 UTC Feb 16 21:38:35 crc kubenswrapper[4777]: W0216 21:38:35.125222 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b6c3c71_4486_4eae_8aae_424892a1d703.slice/crio-0647107f59bd37abda6f1d9b4980a52ee54465d492d03d3435d281990d30dc55 WatchSource:0}: Error finding container 
0647107f59bd37abda6f1d9b4980a52ee54465d492d03d3435d281990d30dc55: Status 404 returned error can't find the container with id 0647107f59bd37abda6f1d9b4980a52ee54465d492d03d3435d281990d30dc55 Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.138466 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f671
3d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.163347 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.182214 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.182287 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.182311 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.182347 4777 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.182374 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:35Z","lastTransitionTime":"2026-02-16T21:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.194411 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.223768 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.248293 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.267997 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.286873 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.286950 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.286964 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.286994 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.287011 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:35Z","lastTransitionTime":"2026-02-16T21:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.291564 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.306746 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e1
3873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.390585 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.390628 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.390873 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:35 crc 
kubenswrapper[4777]: I0216 21:38:35.390996 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.391012 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:35Z","lastTransitionTime":"2026-02-16T21:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.494411 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.494628 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.494651 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.494674 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.494691 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:35Z","lastTransitionTime":"2026-02-16T21:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.514952 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" event={"ID":"7b6c3c71-4486-4eae-8aae-424892a1d703","Type":"ContainerStarted","Data":"338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.515045 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" event={"ID":"7b6c3c71-4486-4eae-8aae-424892a1d703","Type":"ContainerStarted","Data":"24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.515072 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" event={"ID":"7b6c3c71-4486-4eae-8aae-424892a1d703","Type":"ContainerStarted","Data":"0647107f59bd37abda6f1d9b4980a52ee54465d492d03d3435d281990d30dc55"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.520857 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/1.log" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.521775 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/0.log" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.526257 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624" exitCode=1 Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.526321 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" 
event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.526451 4777 scope.go:117] "RemoveContainer" containerID="bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.528011 4777 scope.go:117] "RemoveContainer" containerID="ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624" Feb 16 21:38:35 crc kubenswrapper[4777]: E0216 21:38:35.528434 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.537087 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.553090 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.574069 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.592159 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.597679 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.597754 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.597772 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.597801 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.597820 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:35Z","lastTransitionTime":"2026-02-16T21:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.617620 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.652113 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:32Z\\\",\\\"message\\\":\\\"oval\\\\nI0216 21:38:32.284401 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:32.284414 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:32.284448 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:32.284487 6072 handler.go:190] Sending 
*v1.Pod event handler 3 for removal\\\\nI0216 21:38:32.284495 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0216 21:38:32.284542 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:32.285061 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 21:38:32.285083 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 21:38:32.285095 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:32.285107 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:32.285119 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:32.285139 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:32.285159 6072 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:32.285251 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:32.285386 6072 factory.go:656] Stopping watch factory\\\\nI0216 21:38:32.285449 6072 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.678694 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.695247 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.700060 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.700090 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.700098 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.700116 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.700129 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:35Z","lastTransitionTime":"2026-02-16T21:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.709647 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.721885 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.735416 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.754981 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47827
4c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 
21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.772571 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.791841 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.803613 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.803662 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.803676 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.803697 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.803728 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:35Z","lastTransitionTime":"2026-02-16T21:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.809801 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.828244 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e1
3873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.848901 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.872762 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.892255 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.907509 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.907571 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.907592 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.907620 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.907642 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:35Z","lastTransitionTime":"2026-02-16T21:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.909332 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.928180 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.944491 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.957878 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.974043 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:35 crc kubenswrapper[4777]: I0216 21:38:35.989927 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:35Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.010999 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc 
kubenswrapper[4777]: I0216 21:38:36.011046 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.011058 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.011079 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.011096 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.015586 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:32Z\\\",\\\"message\\\":\\\"oval\\\\nI0216 21:38:32.284401 6072 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0216 21:38:32.284414 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:32.284448 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:32.284487 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0216 21:38:32.284495 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0216 21:38:32.284542 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:32.285061 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 21:38:32.285083 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 21:38:32.285095 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:32.285107 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:32.285119 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:32.285139 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:32.285159 6072 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:32.285251 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:32.285386 6072 factory.go:656] Stopping watch factory\\\\nI0216 21:38:32.285449 6072 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"message\\\":\\\"d *v1.EgressIP event handler 8\\\\nI0216 21:38:34.655791 6212 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:34.655795 6212 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655811 6212 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 
21:38:34.655823 6212 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:34.655851 6212 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:34.655875 6212 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:34.655902 6212 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:34.655914 6212 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:34.655903 6212 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:34.655937 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:34.655929 6212 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655956 6212 factory.go:656] Stopping watch factory\\\\nI0216 21:38:34.655977 6212 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 21:38:34.656052 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:34.656086 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:34.656143 6212 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b
6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.037596 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.060298 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.065617 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.065796 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.065922 4777 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:38:52.065885992 +0000 UTC m=+52.648387134 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066009 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.066029 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.066084 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066039 4777 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066131 4777 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.066174 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066092 4777 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066304 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:52.06617758 +0000 UTC m=+52.648678732 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066334 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:52.066320814 +0000 UTC m=+52.648821956 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066411 4777 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066426 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066480 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066510 4777 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066515 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:52.066492669 +0000 UTC m=+52.648993871 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.066640 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:52.066602312 +0000 UTC m=+52.649103494 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.087407 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"rea
son\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.105072 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.114785 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.114857 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.114875 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.114908 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.114927 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.125294 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 06:57:00.953870629 +0000 UTC Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.181357 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.181480 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.181545 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.181772 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.181931 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.182144 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.219900 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.219961 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.219980 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.220009 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.220033 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.323419 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.323508 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.323531 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.323566 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.323613 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.327410 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-rwm84"] Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.328575 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.328754 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.350295 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.369893 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.370039 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9dpw\" (UniqueName: \"kubernetes.io/projected/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-kube-api-access-m9dpw\") pod \"network-metrics-daemon-rwm84\" (UID: 
\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.370981 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0
796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.392432 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.392513 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.392534 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.392564 4777 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.392587 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.398227 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e
44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.424566 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.439211 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.443029 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.443098 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.443118 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.443148 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.443169 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.468037 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.471376 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.471482 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9dpw\" (UniqueName: \"kubernetes.io/projected/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-kube-api-access-m9dpw\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.471586 4777 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.471689 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs 
podName:a1d5abcd-6e58-4563-98c6-0adb808ed0a7 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:36.971661903 +0000 UTC m=+37.554163015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs") pod "network-metrics-daemon-rwm84" (UID: "a1d5abcd-6e58-4563-98c6-0adb808ed0a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.475673 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.475744 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.475757 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.475784 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.475798 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.485113 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.496118 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.500223 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.500237 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9dpw\" (UniqueName: \"kubernetes.io/projected/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-kube-api-access-m9dpw\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.500251 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.500328 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.500354 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.500369 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.501632 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.515333 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.516097 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.519646 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.519742 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.519758 4777 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.519778 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.519792 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.528198 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403d
a4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.530976 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/1.log" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.535453 4777 scope.go:117] "RemoveContainer" containerID="ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.535633 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.535588 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c
535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.535792 4777 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.536860 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.536890 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.536899 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.536913 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.536925 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.542753 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.554173 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.564982 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.579833 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.598400 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.613727 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.640595 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc 
kubenswrapper[4777]: I0216 21:38:36.640653 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.640670 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.640692 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.640706 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.647891 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4d03f86577a9b5d4434b29b6a6f560782c82f0908ff567dbcd9d8feb02d625\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:32Z\\\",\\\"message\\\":\\\"oval\\\\nI0216 21:38:32.284401 6072 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0216 21:38:32.284414 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:32.284448 6072 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:32.284487 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0216 21:38:32.284495 6072 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0216 21:38:32.284542 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:32.285061 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 21:38:32.285083 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 21:38:32.285095 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:32.285107 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:32.285119 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:32.285139 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:32.285159 6072 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:32.285251 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:32.285386 6072 factory.go:656] Stopping watch factory\\\\nI0216 21:38:32.285449 6072 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"message\\\":\\\"d *v1.EgressIP event handler 8\\\\nI0216 21:38:34.655791 6212 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:34.655795 6212 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655811 6212 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 
21:38:34.655823 6212 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:34.655851 6212 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:34.655875 6212 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:34.655902 6212 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:34.655914 6212 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:34.655903 6212 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:34.655937 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:34.655929 6212 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655956 6212 factory.go:656] Stopping watch factory\\\\nI0216 21:38:34.655977 6212 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 21:38:34.656052 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:34.656086 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:34.656143 6212 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b
6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.665237 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.685610 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.705905 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.738190 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"message\\\":\\\"d *v1.EgressIP event handler 8\\\\nI0216 21:38:34.655791 6212 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:34.655795 6212 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655811 6212 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:34.655823 6212 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:34.655851 6212 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:34.655875 6212 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:34.655902 6212 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:34.655914 6212 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:34.655903 6212 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:34.655937 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:34.655929 6212 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655956 6212 factory.go:656] Stopping watch factory\\\\nI0216 21:38:34.655977 6212 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 21:38:34.656052 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:34.656086 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:34.656143 6212 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b9761152
1fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.743778 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.743865 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.743889 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.743924 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.743949 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.757180 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc 
kubenswrapper[4777]: I0216 21:38:36.780223 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.798772 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.826134 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.843695 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.847408 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.847615 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.847900 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.848122 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.848312 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.867138 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c
7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 
21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.890587 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.911417 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.928153 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.948060 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.951798 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.951849 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.951868 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.951895 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.951912 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:36Z","lastTransitionTime":"2026-02-16T21:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.966270 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.977126 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.977366 4777 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: E0216 21:38:36.977535 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs podName:a1d5abcd-6e58-4563-98c6-0adb808ed0a7 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:37.977490986 +0000 UTC m=+38.559992159 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs") pod "network-metrics-daemon-rwm84" (UID: "a1d5abcd-6e58-4563-98c6-0adb808ed0a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:38:36 crc kubenswrapper[4777]: I0216 21:38:36.992593 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:36Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.012470 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42e
a83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:37Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.055888 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.055949 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.055967 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.055994 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.056014 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:37Z","lastTransitionTime":"2026-02-16T21:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.126251 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 09:37:01.984157672 +0000 UTC Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.159422 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.159503 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.159529 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.159562 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.159585 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:37Z","lastTransitionTime":"2026-02-16T21:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.262846 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.262923 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.262947 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.262977 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.262996 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:37Z","lastTransitionTime":"2026-02-16T21:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.366957 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.367030 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.367051 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.367079 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.367098 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:37Z","lastTransitionTime":"2026-02-16T21:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.470492 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.470560 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.470580 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.470608 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.470626 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:37Z","lastTransitionTime":"2026-02-16T21:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.574235 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.574313 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.574332 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.574356 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.574375 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:37Z","lastTransitionTime":"2026-02-16T21:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.677839 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.677895 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.677914 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.677939 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.677958 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:37Z","lastTransitionTime":"2026-02-16T21:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.781177 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.781230 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.781248 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.781273 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.781291 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:37Z","lastTransitionTime":"2026-02-16T21:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.883970 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.884010 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.884019 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.884034 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.884046 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:37Z","lastTransitionTime":"2026-02-16T21:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.988411 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:37 crc kubenswrapper[4777]: E0216 21:38:37.988685 4777 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:38:37 crc kubenswrapper[4777]: E0216 21:38:37.989102 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs podName:a1d5abcd-6e58-4563-98c6-0adb808ed0a7 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:39.9890706 +0000 UTC m=+40.571571732 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs") pod "network-metrics-daemon-rwm84" (UID: "a1d5abcd-6e58-4563-98c6-0adb808ed0a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.988771 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.989371 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.989526 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.989751 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:37 crc kubenswrapper[4777]: I0216 21:38:37.989959 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:37Z","lastTransitionTime":"2026-02-16T21:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.093146 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.093206 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.093225 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.093253 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.093272 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:38Z","lastTransitionTime":"2026-02-16T21:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.127570 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 18:33:25.118071671 +0000 UTC Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.181656 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.181820 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.181884 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:38 crc kubenswrapper[4777]: E0216 21:38:38.181885 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.181907 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:38 crc kubenswrapper[4777]: E0216 21:38:38.182043 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:38 crc kubenswrapper[4777]: E0216 21:38:38.182238 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:38:38 crc kubenswrapper[4777]: E0216 21:38:38.182541 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.196264 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.196311 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.196329 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.196350 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.196368 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:38Z","lastTransitionTime":"2026-02-16T21:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.300515 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.300621 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.300643 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.300676 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.300699 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:38Z","lastTransitionTime":"2026-02-16T21:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.403351 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.403426 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.403445 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.403472 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.403514 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:38Z","lastTransitionTime":"2026-02-16T21:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.507489 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.507547 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.507559 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.507580 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.507594 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:38Z","lastTransitionTime":"2026-02-16T21:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.610838 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.610917 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.610937 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.610967 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.610988 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:38Z","lastTransitionTime":"2026-02-16T21:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.713971 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.714035 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.714059 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.714097 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.714117 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:38Z","lastTransitionTime":"2026-02-16T21:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.817943 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.818019 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.818042 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.818069 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.818086 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:38Z","lastTransitionTime":"2026-02-16T21:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.921180 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.921242 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.921260 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.921292 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:38 crc kubenswrapper[4777]: I0216 21:38:38.921309 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:38Z","lastTransitionTime":"2026-02-16T21:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.025144 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.025208 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.025251 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.025278 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.025295 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:39Z","lastTransitionTime":"2026-02-16T21:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.127671 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 16:51:11.928125891 +0000 UTC Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.128404 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.128467 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.128485 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.128510 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.128529 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:39Z","lastTransitionTime":"2026-02-16T21:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.231367 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.231441 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.231467 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.231497 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.231521 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:39Z","lastTransitionTime":"2026-02-16T21:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.335783 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.336151 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.336299 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.336459 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.336617 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:39Z","lastTransitionTime":"2026-02-16T21:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.440490 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.440554 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.440577 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.440608 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.440629 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:39Z","lastTransitionTime":"2026-02-16T21:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.544101 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.544173 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.544192 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.544224 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.544243 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:39Z","lastTransitionTime":"2026-02-16T21:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.647798 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.647869 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.647889 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.647918 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.647935 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:39Z","lastTransitionTime":"2026-02-16T21:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.751330 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.751386 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.751401 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.751422 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.751435 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:39Z","lastTransitionTime":"2026-02-16T21:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.854542 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.854607 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.854624 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.854651 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.854669 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:39Z","lastTransitionTime":"2026-02-16T21:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.957566 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.957631 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.957649 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.957674 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:39 crc kubenswrapper[4777]: I0216 21:38:39.957692 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:39Z","lastTransitionTime":"2026-02-16T21:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.012861 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:40 crc kubenswrapper[4777]: E0216 21:38:40.013195 4777 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:38:40 crc kubenswrapper[4777]: E0216 21:38:40.013339 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs podName:a1d5abcd-6e58-4563-98c6-0adb808ed0a7 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:44.013304399 +0000 UTC m=+44.595805531 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs") pod "network-metrics-daemon-rwm84" (UID: "a1d5abcd-6e58-4563-98c6-0adb808ed0a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.061562 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.061619 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.061636 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.061661 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.061680 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:40Z","lastTransitionTime":"2026-02-16T21:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.129146 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 21:06:22.400253 +0000 UTC Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.165180 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.165268 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.165301 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.165332 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.165355 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:40Z","lastTransitionTime":"2026-02-16T21:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.181442 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.181536 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.181635 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:40 crc kubenswrapper[4777]: E0216 21:38:40.181650 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.181674 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:40 crc kubenswrapper[4777]: E0216 21:38:40.182002 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:38:40 crc kubenswrapper[4777]: E0216 21:38:40.182084 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:40 crc kubenswrapper[4777]: E0216 21:38:40.182274 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.202362 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.220930 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.238997 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.258877 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.268766 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.268879 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.268901 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.268962 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.268983 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:40Z","lastTransitionTime":"2026-02-16T21:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.281398 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.317145 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"message\\\":\\\"d *v1.EgressIP event handler 8\\\\nI0216 21:38:34.655791 6212 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:34.655795 6212 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655811 6212 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:34.655823 6212 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:34.655851 6212 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:34.655875 6212 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:34.655902 6212 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:34.655914 6212 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:34.655903 6212 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:34.655937 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:34.655929 6212 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655956 6212 factory.go:656] Stopping watch factory\\\\nI0216 21:38:34.655977 6212 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 21:38:34.656052 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:34.656086 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:34.656143 6212 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b9761152
1fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.333587 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.353785 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.376434 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.376504 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.376544 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 
21:38:40.376578 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.376607 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:40Z","lastTransitionTime":"2026-02-16T21:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.382620 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.412642 4777 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bb
bea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\
\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.431271 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.453340 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.471389 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.480198 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.480269 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.480287 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.480311 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.480328 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:40Z","lastTransitionTime":"2026-02-16T21:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.489493 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.507390 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.527433 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:40Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.584035 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.584119 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.584138 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.584169 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.584188 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:40Z","lastTransitionTime":"2026-02-16T21:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.687249 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.687327 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.687353 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.687384 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.687405 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:40Z","lastTransitionTime":"2026-02-16T21:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.790378 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.790451 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.790470 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.790496 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.790516 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:40Z","lastTransitionTime":"2026-02-16T21:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.894049 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.894115 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.894133 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.894159 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.894179 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:40Z","lastTransitionTime":"2026-02-16T21:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.997140 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.997226 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.997254 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.997287 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:40 crc kubenswrapper[4777]: I0216 21:38:40.997311 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:40Z","lastTransitionTime":"2026-02-16T21:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.100330 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.100398 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.100414 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.100440 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.100457 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:41Z","lastTransitionTime":"2026-02-16T21:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.130026 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 08:21:42.306526206 +0000 UTC Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.204046 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.204136 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.204161 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.204190 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.204210 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:41Z","lastTransitionTime":"2026-02-16T21:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.309365 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.309444 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.309469 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.309501 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.309521 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:41Z","lastTransitionTime":"2026-02-16T21:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.412518 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.412603 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.412622 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.412659 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.412677 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:41Z","lastTransitionTime":"2026-02-16T21:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.517123 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.517223 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.517251 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.517286 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.517310 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:41Z","lastTransitionTime":"2026-02-16T21:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.621829 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.621869 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.621879 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.621896 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.621906 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:41Z","lastTransitionTime":"2026-02-16T21:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.725579 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.725635 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.725656 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.725682 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.725701 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:41Z","lastTransitionTime":"2026-02-16T21:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.829709 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.829836 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.829860 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.829890 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.829915 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:41Z","lastTransitionTime":"2026-02-16T21:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.933491 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.933539 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.933556 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.933579 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:41 crc kubenswrapper[4777]: I0216 21:38:41.933599 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:41Z","lastTransitionTime":"2026-02-16T21:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.036315 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.036410 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.036423 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.036446 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.036465 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:42Z","lastTransitionTime":"2026-02-16T21:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.130633 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 07:46:19.21711264 +0000 UTC Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.146766 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.146838 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.146862 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.146894 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.146921 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:42Z","lastTransitionTime":"2026-02-16T21:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.181032 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.181032 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.181394 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:42 crc kubenswrapper[4777]: E0216 21:38:42.181408 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:42 crc kubenswrapper[4777]: E0216 21:38:42.181698 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:42 crc kubenswrapper[4777]: E0216 21:38:42.181804 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.181855 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:42 crc kubenswrapper[4777]: E0216 21:38:42.182065 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.250149 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.250218 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.250238 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.250266 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.250285 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:42Z","lastTransitionTime":"2026-02-16T21:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.353926 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.353981 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.353998 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.354026 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.354044 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:42Z","lastTransitionTime":"2026-02-16T21:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.456935 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.457009 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.457027 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.457053 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.457074 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:42Z","lastTransitionTime":"2026-02-16T21:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.559783 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.559846 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.559863 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.559888 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.559906 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:42Z","lastTransitionTime":"2026-02-16T21:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.663558 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.663640 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.663657 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.663683 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.663703 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:42Z","lastTransitionTime":"2026-02-16T21:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.767317 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.767392 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.767410 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.767438 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.767457 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:42Z","lastTransitionTime":"2026-02-16T21:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.871494 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.871572 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.871590 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.871621 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.871642 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:42Z","lastTransitionTime":"2026-02-16T21:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.975116 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.975177 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.975198 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.975225 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:42 crc kubenswrapper[4777]: I0216 21:38:42.975247 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:42Z","lastTransitionTime":"2026-02-16T21:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.079292 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.079410 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.079434 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.079479 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.079499 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:43Z","lastTransitionTime":"2026-02-16T21:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.131774 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 05:54:47.051619676 +0000 UTC Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.183166 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.183235 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.183258 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.183286 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.183306 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:43Z","lastTransitionTime":"2026-02-16T21:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.286986 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.287056 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.287073 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.287101 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.287120 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:43Z","lastTransitionTime":"2026-02-16T21:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.390657 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.390764 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.390788 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.390815 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.390832 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:43Z","lastTransitionTime":"2026-02-16T21:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.494200 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.494284 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.494306 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.494336 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.494358 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:43Z","lastTransitionTime":"2026-02-16T21:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.597683 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.597790 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.597809 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.597845 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.597870 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:43Z","lastTransitionTime":"2026-02-16T21:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.700405 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.700456 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.700468 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.700488 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.700501 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:43Z","lastTransitionTime":"2026-02-16T21:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.804245 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.804327 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.804344 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.804373 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.804392 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:43Z","lastTransitionTime":"2026-02-16T21:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.907809 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.907877 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.907904 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.907937 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:43 crc kubenswrapper[4777]: I0216 21:38:43.907957 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:43Z","lastTransitionTime":"2026-02-16T21:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.011514 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.011590 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.011608 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.011633 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.011651 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:44Z","lastTransitionTime":"2026-02-16T21:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.065386 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:38:44 crc kubenswrapper[4777]: E0216 21:38:44.065651 4777 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 16 21:38:44 crc kubenswrapper[4777]: E0216 21:38:44.065790 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs podName:a1d5abcd-6e58-4563-98c6-0adb808ed0a7 nodeName:}" failed. No retries permitted until 2026-02-16 21:38:52.065766475 +0000 UTC m=+52.648267577 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs") pod "network-metrics-daemon-rwm84" (UID: "a1d5abcd-6e58-4563-98c6-0adb808ed0a7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.114775 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.114847 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.114866 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.114893 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.114915 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:44Z","lastTransitionTime":"2026-02-16T21:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.132387 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 08:13:24.42830634 +0000 UTC
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.181227 4777 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.181294 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.181301 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.181321 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:38:44 crc kubenswrapper[4777]: E0216 21:38:44.181558 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:38:44 crc kubenswrapper[4777]: E0216 21:38:44.181764 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:38:44 crc kubenswrapper[4777]: E0216 21:38:44.182040 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:38:44 crc kubenswrapper[4777]: E0216 21:38:44.182135 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.218514 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.218574 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.218590 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.218614 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.218635 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:44Z","lastTransitionTime":"2026-02-16T21:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.322186 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.322258 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.322275 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.322305 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.322326 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:44Z","lastTransitionTime":"2026-02-16T21:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.426404 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.426518 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.426544 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.426582 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.426610 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:44Z","lastTransitionTime":"2026-02-16T21:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.530539 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.530624 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.530644 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.530674 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.530693 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:44Z","lastTransitionTime":"2026-02-16T21:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.634836 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.634935 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.634956 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.634993 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.635018 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:44Z","lastTransitionTime":"2026-02-16T21:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.738647 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.738711 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.738763 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.738788 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.738807 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:44Z","lastTransitionTime":"2026-02-16T21:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.842417 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.842508 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.842531 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.842571 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.842596 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:44Z","lastTransitionTime":"2026-02-16T21:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.946564 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.946619 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.946633 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.946656 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:44 crc kubenswrapper[4777]: I0216 21:38:44.946671 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:44Z","lastTransitionTime":"2026-02-16T21:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.050322 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.050404 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.050425 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.050455 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.050478 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:45Z","lastTransitionTime":"2026-02-16T21:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.133006 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 03:13:15.429351615 +0000 UTC
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.154522 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.154688 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.154743 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.154778 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.154800 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:45Z","lastTransitionTime":"2026-02-16T21:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.258707 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.258819 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.258839 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.258873 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.258893 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:45Z","lastTransitionTime":"2026-02-16T21:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.362341 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.362444 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.362470 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.362506 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.362533 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:45Z","lastTransitionTime":"2026-02-16T21:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.465837 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.465907 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.465926 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.465955 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.465977 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:45Z","lastTransitionTime":"2026-02-16T21:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.569025 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.569069 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.569078 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.569093 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.569103 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:45Z","lastTransitionTime":"2026-02-16T21:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.671649 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.671738 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.671753 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.671786 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.671806 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:45Z","lastTransitionTime":"2026-02-16T21:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.775436 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.775496 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.775518 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.775545 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.775565 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:45Z","lastTransitionTime":"2026-02-16T21:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.878397 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.878439 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.878448 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.878464 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.878476 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:45Z","lastTransitionTime":"2026-02-16T21:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.980517 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.980610 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.980622 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.980636 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:45 crc kubenswrapper[4777]: I0216 21:38:45.980646 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:45Z","lastTransitionTime":"2026-02-16T21:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.084025 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.084065 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.084077 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.084096 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.084109 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.141224 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 00:46:36.893274706 +0000 UTC
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.183618 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.183805 4777 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:38:46 crc kubenswrapper[4777]: E0216 21:38:46.183810 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.183870 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:38:46 crc kubenswrapper[4777]: E0216 21:38:46.183930 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.183958 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:38:46 crc kubenswrapper[4777]: E0216 21:38:46.184014 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:38:46 crc kubenswrapper[4777]: E0216 21:38:46.184077 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.187456 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.187487 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.187498 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.187515 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.187529 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.290896 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.290959 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.290975 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.291010 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.291022 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.393761 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.393839 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.393851 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.393873 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.393902 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.496359 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.496408 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.496421 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.496439 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.496453 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.602462 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.602527 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.602550 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.602575 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.602591 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.699258 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.699319 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.699342 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.699367 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.699386 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:46 crc kubenswrapper[4777]: E0216 21:38:46.722497 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:46Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.727697 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.727745 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.727754 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.727769 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.727779 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:46 crc kubenswrapper[4777]: E0216 21:38:46.745988 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:46Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.750465 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.750502 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.750511 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.750530 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.750542 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:46 crc kubenswrapper[4777]: E0216 21:38:46.765692 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:46Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.770755 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.770800 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.770812 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.770850 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.770863 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:46 crc kubenswrapper[4777]: E0216 21:38:46.784385 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:46Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.788356 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.788387 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.788398 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.788413 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.788426 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:46 crc kubenswrapper[4777]: E0216 21:38:46.806681 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:46Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:46 crc kubenswrapper[4777]: E0216 21:38:46.806815 4777 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.809616 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.809668 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.809682 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.809704 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.809739 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.914029 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.914077 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.914090 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.914112 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:46 crc kubenswrapper[4777]: I0216 21:38:46.914127 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:46Z","lastTransitionTime":"2026-02-16T21:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.130612 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.130698 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.130781 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.130816 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.130838 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:47Z","lastTransitionTime":"2026-02-16T21:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.141904 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 15:36:49.572011656 +0000 UTC Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.234649 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.234706 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.234779 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.234807 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.234826 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:47Z","lastTransitionTime":"2026-02-16T21:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.337394 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.337448 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.337465 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.337489 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.337508 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:47Z","lastTransitionTime":"2026-02-16T21:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.440601 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.440674 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.440696 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.440757 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.440809 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:47Z","lastTransitionTime":"2026-02-16T21:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.543950 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.544009 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.544019 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.544038 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.544052 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:47Z","lastTransitionTime":"2026-02-16T21:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.647811 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.647873 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.647886 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.647905 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.647917 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:47Z","lastTransitionTime":"2026-02-16T21:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.750869 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.750908 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.750919 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.750939 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.750950 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:47Z","lastTransitionTime":"2026-02-16T21:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.853976 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.854047 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.854070 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.854107 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.854130 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:47Z","lastTransitionTime":"2026-02-16T21:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.957983 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.958056 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.958080 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.958120 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:47 crc kubenswrapper[4777]: I0216 21:38:47.958150 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:47Z","lastTransitionTime":"2026-02-16T21:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.061863 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.061933 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.061951 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.061979 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.061997 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:48Z","lastTransitionTime":"2026-02-16T21:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.142761 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 15:29:00.19472398 +0000 UTC Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.165549 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.165653 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.165679 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.165770 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.165801 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:48Z","lastTransitionTime":"2026-02-16T21:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.181309 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.181305 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.181398 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.181418 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:48 crc kubenswrapper[4777]: E0216 21:38:48.181516 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:48 crc kubenswrapper[4777]: E0216 21:38:48.181587 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:38:48 crc kubenswrapper[4777]: E0216 21:38:48.181753 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:48 crc kubenswrapper[4777]: E0216 21:38:48.181849 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.269179 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.269245 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.269260 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.269282 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.269297 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:48Z","lastTransitionTime":"2026-02-16T21:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.372674 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.372749 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.372762 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.372787 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.372801 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:48Z","lastTransitionTime":"2026-02-16T21:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.475977 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.476023 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.476034 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.476052 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.476064 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:48Z","lastTransitionTime":"2026-02-16T21:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.579068 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.579124 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.579145 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.579171 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.579189 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:48Z","lastTransitionTime":"2026-02-16T21:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.681909 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.681964 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.681978 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.681999 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.682011 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:48Z","lastTransitionTime":"2026-02-16T21:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.785828 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.785915 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.785938 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.785971 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.785992 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:48Z","lastTransitionTime":"2026-02-16T21:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.889459 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.889502 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.889511 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.889527 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.889538 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:48Z","lastTransitionTime":"2026-02-16T21:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.992194 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.992249 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.992269 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.992295 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:48 crc kubenswrapper[4777]: I0216 21:38:48.992312 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:48Z","lastTransitionTime":"2026-02-16T21:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.095303 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.095338 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.095352 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.095370 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.095382 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:49Z","lastTransitionTime":"2026-02-16T21:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.143050 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 05:20:41.162206736 +0000 UTC Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.198580 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.198625 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.198636 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.198653 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.198667 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:49Z","lastTransitionTime":"2026-02-16T21:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.301686 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.301788 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.301809 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.301835 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.301853 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:49Z","lastTransitionTime":"2026-02-16T21:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.405702 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.405871 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.405896 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.405932 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.405958 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:49Z","lastTransitionTime":"2026-02-16T21:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.510282 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.510374 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.510392 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.510418 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.510437 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:49Z","lastTransitionTime":"2026-02-16T21:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.613054 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.613130 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.613144 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.613163 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.613176 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:49Z","lastTransitionTime":"2026-02-16T21:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.716938 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.717002 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.717019 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.717049 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.717067 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:49Z","lastTransitionTime":"2026-02-16T21:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.819866 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.819920 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.819936 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.819960 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.819975 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:49Z","lastTransitionTime":"2026-02-16T21:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.924150 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.924195 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.924207 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.924226 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:49 crc kubenswrapper[4777]: I0216 21:38:49.924240 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:49Z","lastTransitionTime":"2026-02-16T21:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.027592 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.027632 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.027641 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.027658 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.027669 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:50Z","lastTransitionTime":"2026-02-16T21:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.131110 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.131183 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.131204 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.131229 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.131247 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:50Z","lastTransitionTime":"2026-02-16T21:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.143462 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 17:36:10.231633892 +0000 UTC Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.182147 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.182234 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.182413 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:50 crc kubenswrapper[4777]: E0216 21:38:50.182501 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.182687 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:50 crc kubenswrapper[4777]: E0216 21:38:50.182758 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:38:50 crc kubenswrapper[4777]: E0216 21:38:50.183041 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:50 crc kubenswrapper[4777]: E0216 21:38:50.183090 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.204173 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.222383 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.234067 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:50 crc 
kubenswrapper[4777]: I0216 21:38:50.234099 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.234108 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.234124 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.234134 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:50Z","lastTransitionTime":"2026-02-16T21:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.254816 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"message\\\":\\\"d *v1.EgressIP event handler 8\\\\nI0216 21:38:34.655791 6212 handler.go:208] Removed 
*v1.Pod event handler 3\\\\nI0216 21:38:34.655795 6212 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655811 6212 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:34.655823 6212 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:34.655851 6212 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:34.655875 6212 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:34.655902 6212 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:34.655914 6212 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:34.655903 6212 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:34.655937 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:34.655929 6212 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655956 6212 factory.go:656] Stopping watch factory\\\\nI0216 21:38:34.655977 6212 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 21:38:34.656052 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:34.656086 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:34.656143 6212 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b9761152
1fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.270075 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.291832 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.311140 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.334238 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.339157 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.339370 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.339542 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.339769 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.339983 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:50Z","lastTransitionTime":"2026-02-16T21:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.349410 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.371773 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 
21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.394833 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.414773 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.430606 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.443123 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.443185 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.443198 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.443219 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.443234 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:50Z","lastTransitionTime":"2026-02-16T21:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.448275 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.465547 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.484627 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.501045 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:50Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.546397 4777 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.546448 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.546463 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.546483 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.546498 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:50Z","lastTransitionTime":"2026-02-16T21:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.649516 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.649557 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.649569 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.649589 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.649603 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:50Z","lastTransitionTime":"2026-02-16T21:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.752372 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.752420 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.752431 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.752448 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.752462 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:50Z","lastTransitionTime":"2026-02-16T21:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.856006 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.856060 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.856077 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.856100 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.856118 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:50Z","lastTransitionTime":"2026-02-16T21:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.959780 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.959838 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.959855 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.959882 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:50 crc kubenswrapper[4777]: I0216 21:38:50.959900 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:50Z","lastTransitionTime":"2026-02-16T21:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.063544 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.063664 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.063696 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.063768 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.063795 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:51Z","lastTransitionTime":"2026-02-16T21:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.144110 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 12:40:43.75422063 +0000 UTC Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.166997 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.167046 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.167062 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.167089 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.167120 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:51Z","lastTransitionTime":"2026-02-16T21:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.182849 4777 scope.go:117] "RemoveContainer" containerID="ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.270491 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.270577 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.270601 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.270634 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.270657 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:51Z","lastTransitionTime":"2026-02-16T21:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.374762 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.374818 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.374836 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.374863 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.374882 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:51Z","lastTransitionTime":"2026-02-16T21:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.477490 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.477546 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.477557 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.477580 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.477594 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:51Z","lastTransitionTime":"2026-02-16T21:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.580280 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.580568 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.580577 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.580593 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.580603 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:51Z","lastTransitionTime":"2026-02-16T21:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.601050 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/1.log" Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.605579 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2"} Feb 16 21:38:51 crc kubenswrapper[4777]: I0216 21:38:51.607758 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.633791 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.650651 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.671245 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"message\\\":\\\"d *v1.EgressIP event handler 8\\\\nI0216 21:38:34.655791 6212 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:34.655795 6212 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655811 6212 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:34.655823 6212 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:34.655851 6212 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:34.655875 6212 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:34.655902 6212 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:34.655914 6212 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:34.655903 6212 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:34.655937 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:34.655929 6212 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655956 6212 factory.go:656] Stopping watch factory\\\\nI0216 21:38:34.655977 6212 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 21:38:34.656052 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:34.656086 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:34.656143 6212 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.683200 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc 
kubenswrapper[4777]: I0216 21:38:51.683504 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.683522 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.683530 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.683545 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.683554 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:51Z","lastTransitionTime":"2026-02-16T21:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.694160 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d845
8cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.705828 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.720552 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.730822 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.741594 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e1
3873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.755702 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.768167 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.779860 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.785901 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.785931 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.785940 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.785956 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.785975 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:51Z","lastTransitionTime":"2026-02-16T21:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.790469 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.802230 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.811734 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:51.823546 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:51Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.461591 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: 
Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 01:12:26.070083982 +0000 UTC Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.462545 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.462683 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.462746 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.462907 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.463117 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.463196 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.463286 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.463345 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.463393 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.463432 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.463460 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.463487 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.463502 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.463533 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:39:24.46350517 +0000 UTC m=+85.046006282 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.463585 4777 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.463625 4777 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.463578 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.463665 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:39:24.463641074 +0000 UTC m=+85.046142186 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.463699 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:39:24.463682995 +0000 UTC m=+85.046184097 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.463640 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.463763 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.463779 4777 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.463853 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 21:39:24.463814928 +0000 UTC m=+85.046316240 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.464017 4777 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.464059 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.464083 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.464097 4777 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.464064 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs podName:a1d5abcd-6e58-4563-98c6-0adb808ed0a7 nodeName:}" failed. No retries permitted until 2026-02-16 21:39:08.464052065 +0000 UTC m=+69.046553417 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs") pod "network-metrics-daemon-rwm84" (UID: "a1d5abcd-6e58-4563-98c6-0adb808ed0a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:38:52 crc kubenswrapper[4777]: E0216 21:38:52.464200 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 21:39:24.464153208 +0000 UTC m=+85.046654540 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.468054 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.468088 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.468099 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.468117 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.468129 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:52Z","lastTransitionTime":"2026-02-16T21:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.572092 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.572136 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.572146 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.572169 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.572184 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:52Z","lastTransitionTime":"2026-02-16T21:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.675479 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.675536 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.675552 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.675578 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.675592 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:52Z","lastTransitionTime":"2026-02-16T21:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.778389 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.778456 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.778471 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.778493 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.778506 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:52Z","lastTransitionTime":"2026-02-16T21:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.881053 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.881095 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.881106 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.881127 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.881138 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:52Z","lastTransitionTime":"2026-02-16T21:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.984291 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.984355 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.984369 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.984393 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:52 crc kubenswrapper[4777]: I0216 21:38:52.984411 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:52Z","lastTransitionTime":"2026-02-16T21:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.087362 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.087435 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.087454 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.087483 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.087505 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:53Z","lastTransitionTime":"2026-02-16T21:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.191800 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.191865 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.191879 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.191899 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.191913 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:53Z","lastTransitionTime":"2026-02-16T21:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.255275 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.272412 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.280996 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.295063 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.295136 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.295154 4777 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.295183 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.295202 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:53Z","lastTransitionTime":"2026-02-16T21:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.304557 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b15
4edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is 
after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.338391 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podI
P\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.361149 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9
fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.391949 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.397852 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.397925 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.397944 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.397969 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.397986 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:53Z","lastTransitionTime":"2026-02-16T21:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.409153 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.425317 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.438063 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.456508 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.462827 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 08:36:49.20760427 +0000 UTC Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.472006 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.489238 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.500962 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.501033 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.501046 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.501073 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.501092 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:53Z","lastTransitionTime":"2026-02-16T21:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.505901 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.522932 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.540773 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.574287 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"message\\\":\\\"d *v1.EgressIP event handler 8\\\\nI0216 21:38:34.655791 6212 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:34.655795 6212 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655811 6212 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:34.655823 6212 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:34.655851 6212 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:34.655875 6212 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:34.655902 6212 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:34.655914 6212 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:34.655903 6212 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:34.655937 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:34.655929 6212 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655956 6212 factory.go:656] Stopping watch factory\\\\nI0216 21:38:34.655977 6212 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 21:38:34.656052 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:34.656086 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:34.656143 6212 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.591701 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc 
kubenswrapper[4777]: I0216 21:38:53.604192 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.604240 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.604249 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.604269 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.604281 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:53Z","lastTransitionTime":"2026-02-16T21:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.614616 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/2.log" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.615518 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/1.log" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.619512 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2" exitCode=1 Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.619569 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2"} Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.619629 4777 scope.go:117] "RemoveContainer" containerID="ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.620682 4777 scope.go:117] "RemoveContainer" containerID="9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2" Feb 16 21:38:53 crc kubenswrapper[4777]: E0216 21:38:53.620914 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.638826 4777 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.659838 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc 
kubenswrapper[4777]: I0216 21:38:53.679941 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.700352 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.715199 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:53 crc 
kubenswrapper[4777]: I0216 21:38:53.715250 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.715262 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.715284 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.715297 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:53Z","lastTransitionTime":"2026-02-16T21:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.734247 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"message\\\":\\\"d *v1.EgressIP event handler 8\\\\nI0216 21:38:34.655791 6212 handler.go:208] Removed 
*v1.Pod event handler 3\\\\nI0216 21:38:34.655795 6212 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655811 6212 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:34.655823 6212 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:34.655851 6212 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:34.655875 6212 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:34.655902 6212 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:34.655914 6212 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:34.655903 6212 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:34.655937 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:34.655929 6212 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655956 6212 factory.go:656] Stopping watch factory\\\\nI0216 21:38:34.655977 6212 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 21:38:34.656052 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:34.656086 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:34.656143 6212 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI0216 21:38:52.999210 6424 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0216 21:38:52.999277 6424 factory.go:1336] 
Added *v1.Node event handler 7\\\\nI0216 21:38:52.999310 6424 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999312 6424 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:52.999332 6424 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:52.999369 6424 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 21:38:52.999368 6424 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:52.999395 6424 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:52.999426 6424 factory.go:656] Stopping watch factory\\\\nI0216 21:38:52.999454 6424 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999948 6424 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:53.000052 6424 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:53.000094 6424 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:53.000124 6424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:53.000201 6424 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b
6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.747256 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.768570 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9df40036-ebf0-42db-a3c7-66862da49bcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a5f7e088b21d8907a92a83e39480a53b84fb0c177281e0ee91eac8409f5858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://090314b845c376f3298674de547ea13a6de12b9dfb56457af52c142701c93a8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84083a9adcea879674a04ba75af8d30d25553064123c03b6a1425b90f08632ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb
1f7653d84652942092018d3c3981edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.787927 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.818214 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.818642 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.818688 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.818701 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.818755 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.818774 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:53Z","lastTransitionTime":"2026-02-16T21:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.833471 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.850243 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.873314 4777 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.896188 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.916533 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.921613 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.921682 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.921701 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.921756 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.921776 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:53Z","lastTransitionTime":"2026-02-16T21:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.935109 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.950189 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e1
3873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:53 crc kubenswrapper[4777]: I0216 21:38:53.972533 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:53Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.025138 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.025208 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.025227 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.025252 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.025267 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:54Z","lastTransitionTime":"2026-02-16T21:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.128165 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.128229 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.128247 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.128270 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.128289 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:54Z","lastTransitionTime":"2026-02-16T21:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.181184 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.181248 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.181281 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:54 crc kubenswrapper[4777]: E0216 21:38:54.181457 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.181496 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:54 crc kubenswrapper[4777]: E0216 21:38:54.181673 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:38:54 crc kubenswrapper[4777]: E0216 21:38:54.181880 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:54 crc kubenswrapper[4777]: E0216 21:38:54.182088 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.231895 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.231962 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.231980 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.232008 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.232028 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:54Z","lastTransitionTime":"2026-02-16T21:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.335641 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.335702 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.335734 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.335757 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.335772 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:54Z","lastTransitionTime":"2026-02-16T21:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.439222 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.439298 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.439323 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.439355 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.439378 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:54Z","lastTransitionTime":"2026-02-16T21:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.463654 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 12:01:24.623008217 +0000 UTC Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.543608 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.543740 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.543760 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.543788 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.543805 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:54Z","lastTransitionTime":"2026-02-16T21:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.624582 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/2.log" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.645896 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.645931 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.645941 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.645955 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.645965 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:54Z","lastTransitionTime":"2026-02-16T21:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.749283 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.749342 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.749361 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.749385 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.749403 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:54Z","lastTransitionTime":"2026-02-16T21:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.851832 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.851900 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.851919 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.851950 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.851970 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:54Z","lastTransitionTime":"2026-02-16T21:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.955428 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.955510 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.955528 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.955557 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:54 crc kubenswrapper[4777]: I0216 21:38:54.955581 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:54Z","lastTransitionTime":"2026-02-16T21:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.059228 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.059300 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.059317 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.059374 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.059402 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:55Z","lastTransitionTime":"2026-02-16T21:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.164119 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.164173 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.164185 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.164207 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.164225 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:55Z","lastTransitionTime":"2026-02-16T21:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.267480 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.267537 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.267549 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.267569 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.267584 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:55Z","lastTransitionTime":"2026-02-16T21:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.371223 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.371305 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.371323 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.371349 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.371367 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:55Z","lastTransitionTime":"2026-02-16T21:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.464820 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 20:26:27.710870226 +0000 UTC Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.474992 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.475059 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.475079 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.475106 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.475128 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:55Z","lastTransitionTime":"2026-02-16T21:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.578065 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.578110 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.578120 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.578136 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.578146 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:55Z","lastTransitionTime":"2026-02-16T21:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.681233 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.681283 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.681301 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.681326 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.681343 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:55Z","lastTransitionTime":"2026-02-16T21:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.784563 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.784646 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.784666 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.784692 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.784758 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:55Z","lastTransitionTime":"2026-02-16T21:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.888128 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.888195 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.888212 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.888238 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.888261 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:55Z","lastTransitionTime":"2026-02-16T21:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.992117 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.992210 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.992292 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.992323 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:55 crc kubenswrapper[4777]: I0216 21:38:55.992343 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:55Z","lastTransitionTime":"2026-02-16T21:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.096254 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.096336 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.096354 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.096809 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.096861 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.181603 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.181603 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.181756 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:56 crc kubenswrapper[4777]: E0216 21:38:56.181858 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.181976 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:56 crc kubenswrapper[4777]: E0216 21:38:56.182142 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:38:56 crc kubenswrapper[4777]: E0216 21:38:56.182292 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:56 crc kubenswrapper[4777]: E0216 21:38:56.182444 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.202427 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.202472 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.202483 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.202501 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.202513 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.306174 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.306258 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.306273 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.306306 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.306319 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.409211 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.409288 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.409310 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.409355 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.409379 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.465227 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 22:36:20.660301838 +0000 UTC Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.513292 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.513352 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.513370 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.513395 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.513415 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.616707 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.616824 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.616843 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.616868 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.616886 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.720479 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.720535 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.720557 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.720583 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.720602 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.824134 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.824205 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.824228 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.824261 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.824287 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.865204 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.865286 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.865309 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.865341 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.865365 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: E0216 21:38:56.890471 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:56Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.896302 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.896357 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.896374 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.896399 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.896419 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: E0216 21:38:56.920221 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:56Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.926113 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.926191 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.926209 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.926229 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.926243 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: E0216 21:38:56.945574 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:56Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.950500 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.950562 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.950580 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.950603 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.950621 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: E0216 21:38:56.973945 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:56Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.978981 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.979171 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.979206 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.979246 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:56 crc kubenswrapper[4777]: I0216 21:38:56.979272 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:56Z","lastTransitionTime":"2026-02-16T21:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:56 crc kubenswrapper[4777]: E0216 21:38:56.999571 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:38:56Z is after 2025-08-24T17:21:41Z" Feb 16 21:38:56 crc kubenswrapper[4777]: E0216 21:38:56.999834 4777 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.002248 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.002311 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.002329 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.002358 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.002385 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:57Z","lastTransitionTime":"2026-02-16T21:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.105851 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.105922 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.105940 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.105968 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.105990 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:57Z","lastTransitionTime":"2026-02-16T21:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.210330 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.210397 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.210409 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.210432 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.210467 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:57Z","lastTransitionTime":"2026-02-16T21:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.314622 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.314691 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.314710 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.314774 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.314796 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:57Z","lastTransitionTime":"2026-02-16T21:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.417896 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.417965 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.417985 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.418014 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.418047 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:57Z","lastTransitionTime":"2026-02-16T21:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.465887 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 07:44:19.885667752 +0000 UTC Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.521978 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.522039 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.522058 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.522087 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.522106 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:57Z","lastTransitionTime":"2026-02-16T21:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.625607 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.626622 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.626897 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.627065 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.627203 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:57Z","lastTransitionTime":"2026-02-16T21:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.730291 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.730353 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.730371 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.730398 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.730416 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:57Z","lastTransitionTime":"2026-02-16T21:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.833611 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.833693 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.833754 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.833784 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.833803 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:57Z","lastTransitionTime":"2026-02-16T21:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.937209 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.937275 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.937292 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.937317 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:57 crc kubenswrapper[4777]: I0216 21:38:57.937339 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:57Z","lastTransitionTime":"2026-02-16T21:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.040479 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.041203 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.041242 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.041274 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.041293 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:58Z","lastTransitionTime":"2026-02-16T21:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.145422 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.145519 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.145537 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.145562 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.145580 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:58Z","lastTransitionTime":"2026-02-16T21:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.180847 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.180989 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.180989 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:38:58 crc kubenswrapper[4777]: E0216 21:38:58.181175 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:38:58 crc kubenswrapper[4777]: E0216 21:38:58.181300 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:38:58 crc kubenswrapper[4777]: E0216 21:38:58.181491 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.181838 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:38:58 crc kubenswrapper[4777]: E0216 21:38:58.182231 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.248769 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.248836 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.248855 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.248878 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.248896 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:58Z","lastTransitionTime":"2026-02-16T21:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.352329 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.352825 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.352845 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.352873 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.352943 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:58Z","lastTransitionTime":"2026-02-16T21:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.456666 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.456999 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.457083 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.457215 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.457306 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:58Z","lastTransitionTime":"2026-02-16T21:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.467004 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 07:10:19.559658219 +0000 UTC Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.561579 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.561649 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.561668 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.561695 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.561776 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:58Z","lastTransitionTime":"2026-02-16T21:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.665472 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.666467 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.666628 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.666815 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.666981 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:58Z","lastTransitionTime":"2026-02-16T21:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.770599 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.771025 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.771182 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.771321 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.771454 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:58Z","lastTransitionTime":"2026-02-16T21:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.874096 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.874158 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.874180 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.874209 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.874234 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:58Z","lastTransitionTime":"2026-02-16T21:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.977081 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.977459 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.977640 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.977852 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:58 crc kubenswrapper[4777]: I0216 21:38:58.978027 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:58Z","lastTransitionTime":"2026-02-16T21:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.081366 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.081791 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.081955 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.082116 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.082251 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:59Z","lastTransitionTime":"2026-02-16T21:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.185291 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.185810 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.186015 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.186212 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.186476 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:59Z","lastTransitionTime":"2026-02-16T21:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.290433 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.290917 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.291101 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.291288 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.291457 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:59Z","lastTransitionTime":"2026-02-16T21:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.395406 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.395476 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.395494 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.395526 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.395551 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:59Z","lastTransitionTime":"2026-02-16T21:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.467566 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 14:47:35.43117105 +0000 UTC Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.498755 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.499146 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.499341 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.499503 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.499673 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:59Z","lastTransitionTime":"2026-02-16T21:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.603908 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.603981 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.604005 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.604040 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.604064 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:59Z","lastTransitionTime":"2026-02-16T21:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.707458 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.708305 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.708481 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.708632 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.708966 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:59Z","lastTransitionTime":"2026-02-16T21:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.812453 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.812508 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.812519 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.812539 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.812552 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:59Z","lastTransitionTime":"2026-02-16T21:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.915928 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.916223 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.916248 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.916276 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:38:59 crc kubenswrapper[4777]: I0216 21:38:59.916296 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:38:59Z","lastTransitionTime":"2026-02-16T21:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.020017 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.020580 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.020773 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.020936 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.021068 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:00Z","lastTransitionTime":"2026-02-16T21:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.123608 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.123696 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.123756 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.123793 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.123817 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:00Z","lastTransitionTime":"2026-02-16T21:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.180891 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.180971 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.181197 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:00 crc kubenswrapper[4777]: E0216 21:39:00.181586 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.181968 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:00 crc kubenswrapper[4777]: E0216 21:39:00.182216 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:00 crc kubenswrapper[4777]: E0216 21:39:00.182312 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:00 crc kubenswrapper[4777]: E0216 21:39:00.182472 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.204262 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.226324 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.226389 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.226408 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.226433 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.226452 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:00Z","lastTransitionTime":"2026-02-16T21:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.236218 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce23a1f9e9b4646d672b47a9abd2eff33a162fe0b87c839881273b5d51adb624\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"message\\\":\\\"d *v1.EgressIP event handler 8\\\\nI0216 21:38:34.655791 6212 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:38:34.655795 6212 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655811 6212 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0216 21:38:34.655823 6212 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0216 21:38:34.655851 6212 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:34.655875 6212 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:34.655902 6212 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:34.655914 6212 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:34.655903 6212 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0216 21:38:34.655937 6212 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:34.655929 6212 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:38:34.655956 6212 factory.go:656] Stopping watch factory\\\\nI0216 21:38:34.655977 6212 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 21:38:34.656052 6212 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:34.656086 6212 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:34.656143 6212 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI0216 21:38:52.999210 6424 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0216 21:38:52.999277 6424 factory.go:1336] Added *v1.Node event handler 7\\\\nI0216 21:38:52.999310 6424 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999312 6424 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:52.999332 6424 
handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:52.999369 6424 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 21:38:52.999368 6424 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:52.999395 6424 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:52.999426 6424 factory.go:656] Stopping watch factory\\\\nI0216 21:38:52.999454 6424 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999948 6424 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:53.000052 6424 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:53.000094 6424 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:53.000124 6424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:53.000201 6424 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\"
:\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.254557 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc 
kubenswrapper[4777]: I0216 21:39:00.274788 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9df40036-ebf0-42db-a3c7-66862da49bcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a5f7e088b21d8907a92a83e39480a53b84fb0c177281e0ee91eac8409f5858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://090314b845c376f3298674de547ea13a6de12b9dfb56457af52c142701c93a8e\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84083a9adcea879674a04ba75af8d30d25553064123c03b6a1425b90f08632ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.300407 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.328983 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.329950 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.330054 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.330081 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.330111 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.330135 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:00Z","lastTransitionTime":"2026-02-16T21:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.347191 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.374156 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.395701 4777 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.414923 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.432314 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.432752 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.432801 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.432816 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.432836 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.432849 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:00Z","lastTransitionTime":"2026-02-16T21:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.445613 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.462290 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e1
3873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.468000 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 11:57:44.419478994 +0000 UTC Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.481362 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.500529 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.517473 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.535164 4777 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.535222 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.535240 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.535268 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.535287 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:00Z","lastTransitionTime":"2026-02-16T21:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.536901 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:00Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.638349 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.638412 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.638420 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.638456 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.638470 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:00Z","lastTransitionTime":"2026-02-16T21:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.741589 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.741659 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.741677 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.741706 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.741782 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:00Z","lastTransitionTime":"2026-02-16T21:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.844779 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.844835 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.844847 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.844871 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.844889 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:00Z","lastTransitionTime":"2026-02-16T21:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.948362 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.948412 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.948422 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.948439 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:00 crc kubenswrapper[4777]: I0216 21:39:00.948452 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:00Z","lastTransitionTime":"2026-02-16T21:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.051465 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.051515 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.051523 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.051540 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.051551 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:01Z","lastTransitionTime":"2026-02-16T21:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.155582 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.155657 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.155674 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.155702 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.155770 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:01Z","lastTransitionTime":"2026-02-16T21:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.259157 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.259244 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.259274 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.259314 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.259342 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:01Z","lastTransitionTime":"2026-02-16T21:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.362914 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.362980 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.363001 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.363028 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.363052 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:01Z","lastTransitionTime":"2026-02-16T21:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.466790 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.466859 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.466878 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.466912 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.466934 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:01Z","lastTransitionTime":"2026-02-16T21:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.469019 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 04:30:08.837132514 +0000 UTC Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.570334 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.570414 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.570438 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.570474 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.570496 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:01Z","lastTransitionTime":"2026-02-16T21:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.673821 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.673887 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.673909 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.673940 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.673966 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:01Z","lastTransitionTime":"2026-02-16T21:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.778421 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.778477 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.778491 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.778512 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.778526 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:01Z","lastTransitionTime":"2026-02-16T21:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.881364 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.881429 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.881451 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.881480 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.881503 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:01Z","lastTransitionTime":"2026-02-16T21:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.984408 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.984471 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.984489 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.984515 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:01 crc kubenswrapper[4777]: I0216 21:39:01.984534 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:01Z","lastTransitionTime":"2026-02-16T21:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.087966 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.088030 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.088047 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.088073 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.088093 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:02Z","lastTransitionTime":"2026-02-16T21:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.181064 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.181167 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.181104 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.181064 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:02 crc kubenswrapper[4777]: E0216 21:39:02.181394 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:02 crc kubenswrapper[4777]: E0216 21:39:02.181522 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:02 crc kubenswrapper[4777]: E0216 21:39:02.181665 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:02 crc kubenswrapper[4777]: E0216 21:39:02.181823 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.199632 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.200033 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.200163 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.200230 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.200260 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:02Z","lastTransitionTime":"2026-02-16T21:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.303928 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.304000 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.304020 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.304053 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.304077 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:02Z","lastTransitionTime":"2026-02-16T21:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.407160 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.407235 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.407253 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.407279 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.407298 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:02Z","lastTransitionTime":"2026-02-16T21:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.469537 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 16:51:24.602215292 +0000 UTC Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.723623 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.723657 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.723666 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.723681 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.723691 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:02Z","lastTransitionTime":"2026-02-16T21:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.826168 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.826206 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.826218 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.826235 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.826247 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:02Z","lastTransitionTime":"2026-02-16T21:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.928331 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.928425 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.928450 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.928487 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:02 crc kubenswrapper[4777]: I0216 21:39:02.928513 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:02Z","lastTransitionTime":"2026-02-16T21:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.031929 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.032017 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.032033 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.032053 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.032066 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:03Z","lastTransitionTime":"2026-02-16T21:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.198710 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.198816 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.198841 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.198872 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.198898 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:03Z","lastTransitionTime":"2026-02-16T21:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.300666 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.300760 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.300779 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.300805 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.300824 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:03Z","lastTransitionTime":"2026-02-16T21:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.403324 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.403381 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.403392 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.403407 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.403417 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:03Z","lastTransitionTime":"2026-02-16T21:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.470658 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 03:07:44.019074747 +0000 UTC Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.506681 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.506818 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.506851 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.506889 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.506919 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:03Z","lastTransitionTime":"2026-02-16T21:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.610651 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.610755 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.610780 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.610813 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.610840 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:03Z","lastTransitionTime":"2026-02-16T21:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.713816 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.713869 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.713887 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.713916 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.713936 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:03Z","lastTransitionTime":"2026-02-16T21:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.817670 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.817767 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.817787 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.817820 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.817839 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:03Z","lastTransitionTime":"2026-02-16T21:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.925960 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.926037 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.926060 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.926091 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:03 crc kubenswrapper[4777]: I0216 21:39:03.926113 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:03Z","lastTransitionTime":"2026-02-16T21:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.029831 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.029951 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.029976 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.030013 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.030039 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:04Z","lastTransitionTime":"2026-02-16T21:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.133373 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.133454 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.133478 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.133511 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.133535 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:04Z","lastTransitionTime":"2026-02-16T21:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.181876 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.182017 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.181895 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:04 crc kubenswrapper[4777]: E0216 21:39:04.182102 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.182132 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:04 crc kubenswrapper[4777]: E0216 21:39:04.182487 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:04 crc kubenswrapper[4777]: E0216 21:39:04.182563 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:04 crc kubenswrapper[4777]: E0216 21:39:04.182756 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.237404 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.237467 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.237487 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.237513 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.237532 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:04Z","lastTransitionTime":"2026-02-16T21:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.341159 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.341204 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.341216 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.341235 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.341248 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:04Z","lastTransitionTime":"2026-02-16T21:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.444553 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.444638 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.444659 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.444771 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.444793 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:04Z","lastTransitionTime":"2026-02-16T21:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.471416 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 01:22:02.593918674 +0000 UTC Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.547556 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.547616 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.547633 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.547658 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.547684 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:04Z","lastTransitionTime":"2026-02-16T21:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.651015 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.651079 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.651096 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.651125 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.651145 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:04Z","lastTransitionTime":"2026-02-16T21:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.753785 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.753848 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.753865 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.753889 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.753906 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:04Z","lastTransitionTime":"2026-02-16T21:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.856742 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.856807 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.856820 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.856844 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.856858 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:04Z","lastTransitionTime":"2026-02-16T21:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.959470 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.959509 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.959517 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.959564 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:04 crc kubenswrapper[4777]: I0216 21:39:04.959574 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:04Z","lastTransitionTime":"2026-02-16T21:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.063162 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.063231 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.063243 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.063264 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.063277 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:05Z","lastTransitionTime":"2026-02-16T21:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.165905 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.165968 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.165989 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.166018 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.166046 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:05Z","lastTransitionTime":"2026-02-16T21:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.188616 4777 scope.go:117] "RemoveContainer" containerID="9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2" Feb 16 21:39:05 crc kubenswrapper[4777]: E0216 21:39:05.188954 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.208626 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.226317 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.243634 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.257845 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9df40036-ebf0-42db-a3c7-66862da49bcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a5f7e088b21d8907a92a83e39480a53b84fb0c177281e0ee91eac8409f5858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://090314b845c376f3298674de547ea13a6de12b9dfb56457af52c142701c93a8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84083a9adcea879674a04ba75af8d30d25553064123c03b6a1425b90f08632ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.269504 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.269565 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.269585 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.269610 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.269628 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:05Z","lastTransitionTime":"2026-02-16T21:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.278985 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.302160 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.332255 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI0216 21:38:52.999210 6424 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0216 21:38:52.999277 6424 factory.go:1336] Added *v1.Node event handler 7\\\\nI0216 21:38:52.999310 6424 
factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999312 6424 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:52.999332 6424 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:52.999369 6424 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 21:38:52.999368 6424 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:52.999395 6424 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:52.999426 6424 factory.go:656] Stopping watch factory\\\\nI0216 21:38:52.999454 6424 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999948 6424 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:53.000052 6424 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:53.000094 6424 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:53.000124 6424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:53.000201 6424 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b9761152
1fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.347309 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.361309 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.372273 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.372311 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.372321 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 
21:39:05.372338 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.372349 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:05Z","lastTransitionTime":"2026-02-16T21:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.380969 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.403620 4777 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bb
bea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\
\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.418189 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.438341 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.456221 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.471612 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 12:11:48.456365694 +0000 UTC Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.472889 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.475024 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.475080 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.475090 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.475107 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.475120 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:05Z","lastTransitionTime":"2026-02-16T21:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.487539 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.503544 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e1
3873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:05Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.577636 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.577695 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.577705 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:05 crc 
kubenswrapper[4777]: I0216 21:39:05.577750 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.577763 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:05Z","lastTransitionTime":"2026-02-16T21:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.681617 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.681696 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.681743 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.681771 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.681788 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:05Z","lastTransitionTime":"2026-02-16T21:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.785136 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.785218 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.785242 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.785271 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.785291 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:05Z","lastTransitionTime":"2026-02-16T21:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.888532 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.888600 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.888619 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.888644 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.888664 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:05Z","lastTransitionTime":"2026-02-16T21:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.991829 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.991893 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.991911 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.991940 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:05 crc kubenswrapper[4777]: I0216 21:39:05.991958 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:05Z","lastTransitionTime":"2026-02-16T21:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.094614 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.094663 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.094674 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.094690 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.094700 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:06Z","lastTransitionTime":"2026-02-16T21:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.181682 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.181868 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.181977 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:06 crc kubenswrapper[4777]: E0216 21:39:06.182001 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:06 crc kubenswrapper[4777]: E0216 21:39:06.182139 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.182233 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:06 crc kubenswrapper[4777]: E0216 21:39:06.182394 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:06 crc kubenswrapper[4777]: E0216 21:39:06.182537 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.197511 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.197568 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.197580 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.197600 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.197613 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:06Z","lastTransitionTime":"2026-02-16T21:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.300924 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.300999 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.301022 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.301051 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.301070 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:06Z","lastTransitionTime":"2026-02-16T21:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.403951 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.404026 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.404037 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.404056 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.404070 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:06Z","lastTransitionTime":"2026-02-16T21:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.471756 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 06:47:25.873757779 +0000 UTC Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.506876 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.506945 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.506957 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.506974 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.507015 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:06Z","lastTransitionTime":"2026-02-16T21:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.610056 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.610101 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.610112 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.610129 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.610140 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:06Z","lastTransitionTime":"2026-02-16T21:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.713836 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.713906 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.713932 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.713963 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.713987 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:06Z","lastTransitionTime":"2026-02-16T21:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.816413 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.816481 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.816493 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.816510 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.816520 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:06Z","lastTransitionTime":"2026-02-16T21:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.919140 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.919192 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.919208 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.919261 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:06 crc kubenswrapper[4777]: I0216 21:39:06.919279 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:06Z","lastTransitionTime":"2026-02-16T21:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.025042 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.025134 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.025156 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.025181 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.025205 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.271608 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:07 crc kubenswrapper[4777]: E0216 21:39:07.271984 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.275100 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.275131 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.275140 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.275157 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.275167 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.276609 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.276631 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.276640 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.276649 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.276657 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: E0216 21:39:07.291032 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:07Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.295822 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.295873 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.295887 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.295907 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.295920 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: E0216 21:39:07.312132 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:07Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.316871 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.316925 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.316935 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.316952 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.316965 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: E0216 21:39:07.329774 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:07Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.334432 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.334473 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.334486 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.334504 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.334518 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: E0216 21:39:07.348161 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:07Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.353360 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.353406 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.353439 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.353474 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.353548 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: E0216 21:39:07.373701 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:07Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:07 crc kubenswrapper[4777]: E0216 21:39:07.374070 4777 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.378872 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.378915 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.378927 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.378951 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.378964 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.472697 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 09:21:00.52706791 +0000 UTC Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.481077 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.481132 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.481149 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.481177 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.481199 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.584316 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.584370 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.584382 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.584402 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.584414 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.687549 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.687596 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.687605 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.687622 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.687632 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.790757 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.790808 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.790821 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.790857 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.790873 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.893573 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.893623 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.893632 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.893652 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.893664 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.996377 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.996436 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.996450 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.996470 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:07 crc kubenswrapper[4777]: I0216 21:39:07.996483 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:07Z","lastTransitionTime":"2026-02-16T21:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.098946 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.098997 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.099007 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.099025 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.099036 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:08Z","lastTransitionTime":"2026-02-16T21:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.181219 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:08 crc kubenswrapper[4777]: E0216 21:39:08.181423 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.181489 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.181569 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:08 crc kubenswrapper[4777]: E0216 21:39:08.181684 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:08 crc kubenswrapper[4777]: E0216 21:39:08.181840 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.200627 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.200689 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.200732 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.200757 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.200772 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:08Z","lastTransitionTime":"2026-02-16T21:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.304294 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.304384 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.304408 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.304440 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.304462 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:08Z","lastTransitionTime":"2026-02-16T21:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.407577 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.407647 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.407664 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.407699 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.407761 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:08Z","lastTransitionTime":"2026-02-16T21:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.473796 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 02:39:58.827665673 +0000 UTC Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.483846 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:08 crc kubenswrapper[4777]: E0216 21:39:08.484123 4777 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:39:08 crc kubenswrapper[4777]: E0216 21:39:08.484232 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs podName:a1d5abcd-6e58-4563-98c6-0adb808ed0a7 nodeName:}" failed. No retries permitted until 2026-02-16 21:39:40.484207704 +0000 UTC m=+101.066708846 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs") pod "network-metrics-daemon-rwm84" (UID: "a1d5abcd-6e58-4563-98c6-0adb808ed0a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.512296 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.512378 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.512401 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.512434 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.512457 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:08Z","lastTransitionTime":"2026-02-16T21:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.616358 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.616437 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.616464 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.616493 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.616512 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:08Z","lastTransitionTime":"2026-02-16T21:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.723656 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.723750 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.723766 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.723790 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.723878 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:08Z","lastTransitionTime":"2026-02-16T21:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.826404 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.826461 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.826481 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.826507 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.826524 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:08Z","lastTransitionTime":"2026-02-16T21:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.929394 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.929442 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.929453 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.929471 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:08 crc kubenswrapper[4777]: I0216 21:39:08.929484 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:08Z","lastTransitionTime":"2026-02-16T21:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.032470 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.032530 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.032564 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.032584 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.032597 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:09Z","lastTransitionTime":"2026-02-16T21:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.135134 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.135185 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.135196 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.135215 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.135229 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:09Z","lastTransitionTime":"2026-02-16T21:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.181534 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:09 crc kubenswrapper[4777]: E0216 21:39:09.181783 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.238175 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.238229 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.238245 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.238270 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.238286 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:09Z","lastTransitionTime":"2026-02-16T21:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.340817 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.340883 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.340903 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.340930 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.340948 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:09Z","lastTransitionTime":"2026-02-16T21:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.444967 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.445034 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.445057 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.445088 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.445114 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:09Z","lastTransitionTime":"2026-02-16T21:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.474636 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 07:54:22.913082035 +0000 UTC Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.547605 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.547664 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.547682 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.547753 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.547780 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:09Z","lastTransitionTime":"2026-02-16T21:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.650416 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.650463 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.650473 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.650490 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.650500 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:09Z","lastTransitionTime":"2026-02-16T21:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.752431 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.752485 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.752493 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.752508 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.752521 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:09Z","lastTransitionTime":"2026-02-16T21:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.752969 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vpf28_71656da7-4f33-419d-aaba-93bf9158f706/kube-multus/0.log" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.753011 4777 generic.go:334] "Generic (PLEG): container finished" podID="71656da7-4f33-419d-aaba-93bf9158f706" containerID="9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de" exitCode=1 Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.753036 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vpf28" event={"ID":"71656da7-4f33-419d-aaba-93bf9158f706","Type":"ContainerDied","Data":"9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.753386 4777 scope.go:117] "RemoveContainer" containerID="9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.766998 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.788143 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.807834 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.819318 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.830329 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.843782 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.854788 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.854809 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.854819 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.854834 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.854850 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:09Z","lastTransitionTime":"2026-02-16T21:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.858992 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.875684 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.889861 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9df40036-ebf0-42db-a3c7-66862da49bcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a5f7e088b21d8907a92a83e39480a53b84fb0c177281e0ee91eac8409f5858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://090314b845c376f3298674de547ea13a6de12b9dfb56457af52c142701c93a8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84083a9adcea879674a04ba75af8d30d25553064123c03b6a1425b90f08632ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.906148 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.920775 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:39:09Z\\\",\\\"message\\\":\\\"2026-02-16T21:38:23+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed\\\\n2026-02-16T21:38:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed to /host/opt/cni/bin/\\\\n2026-02-16T21:38:24Z [verbose] multus-daemon started\\\\n2026-02-16T21:38:24Z [verbose] Readiness Indicator file check\\\\n2026-02-16T21:39:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.943372 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI0216 21:38:52.999210 6424 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0216 21:38:52.999277 6424 factory.go:1336] Added *v1.Node event handler 7\\\\nI0216 21:38:52.999310 6424 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999312 6424 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:52.999332 6424 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:52.999369 6424 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 21:38:52.999368 6424 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:52.999395 6424 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:52.999426 6424 factory.go:656] Stopping watch factory\\\\nI0216 21:38:52.999454 6424 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999948 6424 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:53.000052 6424 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:53.000094 6424 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:53.000124 6424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:53.000201 6424 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b9761152
1fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.957882 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.957935 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.957948 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.957971 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.957984 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:09Z","lastTransitionTime":"2026-02-16T21:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.962097 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc 
kubenswrapper[4777]: I0216 21:39:09.978386 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:09 crc kubenswrapper[4777]: I0216 21:39:09.992551 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:09Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.016451 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.053259 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.059824 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.059897 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.059910 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.059931 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.059943 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:10Z","lastTransitionTime":"2026-02-16T21:39:10Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.162852 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.162892 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.162903 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.162920 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.162931 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:10Z","lastTransitionTime":"2026-02-16T21:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.181914 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.181988 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:10 crc kubenswrapper[4777]: E0216 21:39:10.182076 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:10 crc kubenswrapper[4777]: E0216 21:39:10.182197 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.182306 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:10 crc kubenswrapper[4777]: E0216 21:39:10.182414 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.198128 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.213992 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.230503 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.245003 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.262002 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.266662 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.266728 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.266740 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.266758 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.266769 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:10Z","lastTransitionTime":"2026-02-16T21:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.276232 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.299545 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.315590 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.329230 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.348974 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.362296 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.369531 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.369576 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.369587 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.369605 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.369620 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:10Z","lastTransitionTime":"2026-02-16T21:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.379536 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.394923 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9df40036-ebf0-42db-a3c7-66862da49bcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a5f7e088b21d8907a92a83e39480a53b84fb0c177281e0ee91eac8409f5858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://090314b845c376f3298674de547ea13a6de12b9dfb56457af52c142701c93a8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84083a9adcea879674a04ba75af8d30d25553064123c03b6a1425b90f08632ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.411320 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.426992 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:39:09Z\\\",\\\"message\\\":\\\"2026-02-16T21:38:23+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed\\\\n2026-02-16T21:38:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed to /host/opt/cni/bin/\\\\n2026-02-16T21:38:24Z [verbose] multus-daemon started\\\\n2026-02-16T21:38:24Z [verbose] Readiness Indicator file check\\\\n2026-02-16T21:39:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.455005 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI0216 21:38:52.999210 6424 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0216 21:38:52.999277 6424 factory.go:1336] Added *v1.Node event handler 7\\\\nI0216 21:38:52.999310 6424 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999312 6424 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:52.999332 6424 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:52.999369 6424 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 21:38:52.999368 6424 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:52.999395 6424 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:52.999426 6424 factory.go:656] Stopping watch factory\\\\nI0216 21:38:52.999454 6424 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999948 6424 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:53.000052 6424 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:53.000094 6424 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:53.000124 6424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:53.000201 6424 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b9761152
1fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.472393 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.473045 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.473127 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.473151 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.473182 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.473378 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:10Z","lastTransitionTime":"2026-02-16T21:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.475132 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 00:47:10.075287282 +0000 UTC Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.577180 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.577444 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.577544 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.577628 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.577697 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:10Z","lastTransitionTime":"2026-02-16T21:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.680762 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.681047 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.681124 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.681190 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.681257 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:10Z","lastTransitionTime":"2026-02-16T21:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.758687 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vpf28_71656da7-4f33-419d-aaba-93bf9158f706/kube-multus/0.log" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.758943 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vpf28" event={"ID":"71656da7-4f33-419d-aaba-93bf9158f706","Type":"ContainerStarted","Data":"0b7c71b79ec54c7373971886c4f36d1a4e31d558dbecbb5925db4d29961baeb7"} Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.773829 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.783814 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.783852 4777 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.783861 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.783876 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.783887 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:10Z","lastTransitionTime":"2026-02-16T21:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.788794 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.801864 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.812668 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.823148 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e1
3873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.837252 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d
37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.850108 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.862208 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.870795 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.882420 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.886075 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.886193 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.886288 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.886408 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.886485 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:10Z","lastTransitionTime":"2026-02-16T21:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.897022 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.914586 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.927532 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9df40036-ebf0-42db-a3c7-66862da49bcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a5f7e088b21d8907a92a83e39480a53b84fb0c177281e0ee91eac8409f5858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://090314b845c376f3298674de547ea13a6de12b9dfb56457af52c142701c93a8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84083a9adcea879674a04ba75af8d30d25553064123c03b6a1425b90f08632ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.940329 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.958377 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b7c71b79ec54c7373971886c4f36d1a4e31d558dbecbb5925db4d29961baeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:39:09Z\\\",\\\"message\\\":\\\"2026-02-16T21:38:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed\\\\n2026-02-16T21:38:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed to /host/opt/cni/bin/\\\\n2026-02-16T21:38:24Z [verbose] multus-daemon started\\\\n2026-02-16T21:38:24Z [verbose] 
Readiness Indicator file check\\\\n2026-02-16T21:39:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.987639 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI0216 21:38:52.999210 6424 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0216 21:38:52.999277 6424 factory.go:1336] Added *v1.Node event handler 7\\\\nI0216 21:38:52.999310 6424 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999312 6424 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:52.999332 6424 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:52.999369 6424 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 21:38:52.999368 6424 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:52.999395 6424 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:52.999426 6424 factory.go:656] Stopping watch factory\\\\nI0216 21:38:52.999454 6424 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999948 6424 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:53.000052 6424 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:53.000094 6424 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:53.000124 6424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:53.000201 6424 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b9761152
1fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.990308 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.990369 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.990391 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.990419 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:10 crc kubenswrapper[4777]: I0216 21:39:10.990438 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:10Z","lastTransitionTime":"2026-02-16T21:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:10.999931 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:10Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:11 crc 
kubenswrapper[4777]: I0216 21:39:11.093253 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.093302 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.093314 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.093332 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.093343 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:11Z","lastTransitionTime":"2026-02-16T21:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.181376 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:11 crc kubenswrapper[4777]: E0216 21:39:11.181531 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.196288 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.196328 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.196340 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.196358 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.196370 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:11Z","lastTransitionTime":"2026-02-16T21:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.298653 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.298686 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.298695 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.298728 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.298740 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:11Z","lastTransitionTime":"2026-02-16T21:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.401258 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.401325 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.401346 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.401372 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.401395 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:11Z","lastTransitionTime":"2026-02-16T21:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.475304 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 00:50:30.12971424 +0000 UTC Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.504094 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.504173 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.504191 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.504217 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.504253 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:11Z","lastTransitionTime":"2026-02-16T21:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.607587 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.607626 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.607639 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.607658 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.607671 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:11Z","lastTransitionTime":"2026-02-16T21:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.712430 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.712475 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.712487 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.712505 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.712517 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:11Z","lastTransitionTime":"2026-02-16T21:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.815091 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.815154 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.815176 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.815219 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.815244 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:11Z","lastTransitionTime":"2026-02-16T21:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.918004 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.918062 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.918074 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.918098 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:11 crc kubenswrapper[4777]: I0216 21:39:11.918112 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:11Z","lastTransitionTime":"2026-02-16T21:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.020382 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.020449 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.020467 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.020497 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.020516 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:12Z","lastTransitionTime":"2026-02-16T21:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.123345 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.123404 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.123420 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.123445 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.123463 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:12Z","lastTransitionTime":"2026-02-16T21:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.181202 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.181222 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:12 crc kubenswrapper[4777]: E0216 21:39:12.181371 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:12 crc kubenswrapper[4777]: E0216 21:39:12.181584 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.181673 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:12 crc kubenswrapper[4777]: E0216 21:39:12.182158 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.226640 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.226685 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.226707 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.226753 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.226767 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:12Z","lastTransitionTime":"2026-02-16T21:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.329603 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.329669 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.329685 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.329704 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.329752 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:12Z","lastTransitionTime":"2026-02-16T21:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.432589 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.432641 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.432650 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.432668 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.432677 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:12Z","lastTransitionTime":"2026-02-16T21:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.475704 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 18:56:53.273289077 +0000 UTC Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.534729 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.534781 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.534790 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.534808 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.534819 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:12Z","lastTransitionTime":"2026-02-16T21:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.637574 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.637646 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.637665 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.637694 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.637743 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:12Z","lastTransitionTime":"2026-02-16T21:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.740132 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.740203 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.740221 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.740248 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.740267 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:12Z","lastTransitionTime":"2026-02-16T21:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.844707 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.844792 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.844805 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.844829 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.844847 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:12Z","lastTransitionTime":"2026-02-16T21:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.947449 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.947506 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.947524 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.947550 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:12 crc kubenswrapper[4777]: I0216 21:39:12.947573 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:12Z","lastTransitionTime":"2026-02-16T21:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.050147 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.050211 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.050228 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.050254 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.050272 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:13Z","lastTransitionTime":"2026-02-16T21:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.152878 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.152965 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.152984 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.153040 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.153058 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:13Z","lastTransitionTime":"2026-02-16T21:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.181063 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:13 crc kubenswrapper[4777]: E0216 21:39:13.181225 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.255300 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.255764 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.255833 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.255894 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.255947 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:13Z","lastTransitionTime":"2026-02-16T21:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.359268 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.359591 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.359660 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.359748 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.359814 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:13Z","lastTransitionTime":"2026-02-16T21:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.462834 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.462897 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.462911 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.462931 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.462944 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:13Z","lastTransitionTime":"2026-02-16T21:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.476336 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 14:18:08.107585761 +0000 UTC Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.565933 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.565976 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.565988 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.566009 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.566021 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:13Z","lastTransitionTime":"2026-02-16T21:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.668191 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.668244 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.668255 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.668274 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.668285 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:13Z","lastTransitionTime":"2026-02-16T21:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.770666 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.770742 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.770757 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.770780 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.770792 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:13Z","lastTransitionTime":"2026-02-16T21:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.873249 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.873342 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.873363 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.873395 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.873419 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:13Z","lastTransitionTime":"2026-02-16T21:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.976290 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.976322 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.976330 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.976345 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:13 crc kubenswrapper[4777]: I0216 21:39:13.976355 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:13Z","lastTransitionTime":"2026-02-16T21:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.079752 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.079823 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.079842 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.079869 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.079889 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:14Z","lastTransitionTime":"2026-02-16T21:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.180777 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.180812 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:14 crc kubenswrapper[4777]: E0216 21:39:14.180912 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:14 crc kubenswrapper[4777]: E0216 21:39:14.181113 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.181326 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:14 crc kubenswrapper[4777]: E0216 21:39:14.181620 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.183239 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.183327 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.183346 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.183401 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.183421 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:14Z","lastTransitionTime":"2026-02-16T21:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.286691 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.286795 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.286821 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.286849 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.286872 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:14Z","lastTransitionTime":"2026-02-16T21:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.390156 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.390213 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.390231 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.390254 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.390273 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:14Z","lastTransitionTime":"2026-02-16T21:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.477465 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 20:46:10.955093347 +0000 UTC Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.493998 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.494047 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.494069 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.494094 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.494115 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:14Z","lastTransitionTime":"2026-02-16T21:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.596919 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.596980 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.596998 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.597027 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.597047 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:14Z","lastTransitionTime":"2026-02-16T21:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.700024 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.700110 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.700129 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.700153 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.700171 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:14Z","lastTransitionTime":"2026-02-16T21:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.802604 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.802667 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.802690 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.802756 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.802780 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:14Z","lastTransitionTime":"2026-02-16T21:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.905704 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.905770 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.905779 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.905810 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:14 crc kubenswrapper[4777]: I0216 21:39:14.905820 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:14Z","lastTransitionTime":"2026-02-16T21:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.008418 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.008497 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.008533 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.008568 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.008590 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:15Z","lastTransitionTime":"2026-02-16T21:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.112149 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.112197 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.112210 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.112231 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.112244 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:15Z","lastTransitionTime":"2026-02-16T21:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.181160 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:15 crc kubenswrapper[4777]: E0216 21:39:15.181395 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.214444 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.214553 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.214579 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.214607 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.214628 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:15Z","lastTransitionTime":"2026-02-16T21:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.317675 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.317768 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.317810 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.317833 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.317849 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:15Z","lastTransitionTime":"2026-02-16T21:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.420867 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.420912 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.420923 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.420940 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.420953 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:15Z","lastTransitionTime":"2026-02-16T21:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.478032 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 12:18:57.694239679 +0000 UTC Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.524363 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.524464 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.524487 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.524552 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.524572 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:15Z","lastTransitionTime":"2026-02-16T21:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.628127 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.628272 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.628297 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.628333 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.628352 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:15Z","lastTransitionTime":"2026-02-16T21:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.739375 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.739489 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.739512 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.739540 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.739564 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:15Z","lastTransitionTime":"2026-02-16T21:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.843633 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.843749 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.843776 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.843811 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.843838 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:15Z","lastTransitionTime":"2026-02-16T21:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.951157 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.951225 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.951239 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.951263 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:15 crc kubenswrapper[4777]: I0216 21:39:15.951278 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:15Z","lastTransitionTime":"2026-02-16T21:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.054895 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.054972 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.054992 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.055020 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.055039 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:16Z","lastTransitionTime":"2026-02-16T21:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.158902 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.158971 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.158989 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.159016 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.159036 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:16Z","lastTransitionTime":"2026-02-16T21:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.181101 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.181142 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.181247 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:16 crc kubenswrapper[4777]: E0216 21:39:16.181431 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:16 crc kubenswrapper[4777]: E0216 21:39:16.181756 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:16 crc kubenswrapper[4777]: E0216 21:39:16.181838 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.262294 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.262362 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.262383 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.262412 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.262436 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:16Z","lastTransitionTime":"2026-02-16T21:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.366015 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.366093 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.366112 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.366141 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.366162 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:16Z","lastTransitionTime":"2026-02-16T21:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.469884 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.469951 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.469969 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.469994 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.470015 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:16Z","lastTransitionTime":"2026-02-16T21:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.478590 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 13:11:14.615558666 +0000 UTC Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.574082 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.574155 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.574180 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.574214 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.574239 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:16Z","lastTransitionTime":"2026-02-16T21:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.677757 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.677816 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.677835 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.677859 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.677879 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:16Z","lastTransitionTime":"2026-02-16T21:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.780627 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.780902 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.780975 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.781005 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.782340 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:16Z","lastTransitionTime":"2026-02-16T21:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.884793 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.884860 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.884885 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.884917 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.884941 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:16Z","lastTransitionTime":"2026-02-16T21:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.988605 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.988654 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.988671 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.988695 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:16 crc kubenswrapper[4777]: I0216 21:39:16.988735 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:16Z","lastTransitionTime":"2026-02-16T21:39:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.091781 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.091838 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.091855 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.091878 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.091896 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.180812 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:17 crc kubenswrapper[4777]: E0216 21:39:17.181056 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.194533 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.194575 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.194591 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.194611 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.194629 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.297999 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.298058 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.298076 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.298102 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.298121 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.401041 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.401160 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.401188 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.401216 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.401236 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.479519 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 15:33:00.322318255 +0000 UTC Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.504348 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.504414 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.504439 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.504472 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.504493 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.608608 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.608666 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.608682 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.608708 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.608763 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.646537 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.646591 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.646603 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.646622 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.646633 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: E0216 21:39:17.667230 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:17Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.672505 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.672567 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.672588 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.672617 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.672637 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: E0216 21:39:17.692763 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:17Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.698486 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.698561 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.698582 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.699163 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.699211 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: E0216 21:39:17.725252 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:17Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.730445 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.730520 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.730544 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.730612 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.730666 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: E0216 21:39:17.754868 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:17Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.760621 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.760709 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.760747 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.760774 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.760792 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: E0216 21:39:17.785153 4777 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd88e8c4-6b8b-421e-820d-c535c131d8af\\\",\\\"systemUUID\\\":\\\"4b23e571-272d-4c87-821c-0e1a2dceb613\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:17Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:17 crc kubenswrapper[4777]: E0216 21:39:17.785454 4777 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.787519 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.787573 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.787593 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.787617 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.787635 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.891470 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.891746 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.891772 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.891802 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.891822 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.995748 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.995815 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.995833 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.995861 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:17 crc kubenswrapper[4777]: I0216 21:39:17.995880 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:17Z","lastTransitionTime":"2026-02-16T21:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.099432 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.099491 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.099509 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.099535 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.099556 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:18Z","lastTransitionTime":"2026-02-16T21:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.181821 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.181937 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:18 crc kubenswrapper[4777]: E0216 21:39:18.182050 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:18 crc kubenswrapper[4777]: E0216 21:39:18.182258 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.183028 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:18 crc kubenswrapper[4777]: E0216 21:39:18.183456 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.202782 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.202860 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.202879 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.202903 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.202921 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:18Z","lastTransitionTime":"2026-02-16T21:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.306046 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.306097 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.306116 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.306141 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.306159 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:18Z","lastTransitionTime":"2026-02-16T21:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.409233 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.409271 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.409281 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.409298 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.409310 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:18Z","lastTransitionTime":"2026-02-16T21:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.480341 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 22:45:11.160067172 +0000 UTC Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.512419 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.512475 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.512491 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.512520 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.512539 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:18Z","lastTransitionTime":"2026-02-16T21:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.615956 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.616053 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.616073 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.616099 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.616116 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:18Z","lastTransitionTime":"2026-02-16T21:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.719678 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.719766 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.719788 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.719823 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.719846 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:18Z","lastTransitionTime":"2026-02-16T21:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.822849 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.822896 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.822916 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.822939 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.822957 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:18Z","lastTransitionTime":"2026-02-16T21:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.927824 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.927876 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.927893 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.927917 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:18 crc kubenswrapper[4777]: I0216 21:39:18.927935 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:18Z","lastTransitionTime":"2026-02-16T21:39:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.031020 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.031096 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.031116 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.031145 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.031169 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:19Z","lastTransitionTime":"2026-02-16T21:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.134272 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.134360 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.134380 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.134411 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.134431 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:19Z","lastTransitionTime":"2026-02-16T21:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.181635 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:19 crc kubenswrapper[4777]: E0216 21:39:19.181862 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.237601 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.237655 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.237663 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.237683 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.237696 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:19Z","lastTransitionTime":"2026-02-16T21:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.341079 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.341144 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.341151 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.341167 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.341177 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:19Z","lastTransitionTime":"2026-02-16T21:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.444063 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.444124 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.444144 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.444169 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.444188 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:19Z","lastTransitionTime":"2026-02-16T21:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.480999 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 14:48:24.674973465 +0000 UTC Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.547204 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.547262 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.547278 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.547302 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.547321 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:19Z","lastTransitionTime":"2026-02-16T21:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.650670 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.650767 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.650788 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.650837 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.650866 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:19Z","lastTransitionTime":"2026-02-16T21:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.754663 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.754773 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.754795 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.754823 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.754840 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:19Z","lastTransitionTime":"2026-02-16T21:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.857837 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.857879 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.857887 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.857903 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.857912 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:19Z","lastTransitionTime":"2026-02-16T21:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.961285 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.961346 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.961363 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.961387 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:19 crc kubenswrapper[4777]: I0216 21:39:19.961405 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:19Z","lastTransitionTime":"2026-02-16T21:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.065208 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.065292 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.065319 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.065352 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.065380 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:20Z","lastTransitionTime":"2026-02-16T21:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.168033 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.168200 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.168232 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.168264 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.168302 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:20Z","lastTransitionTime":"2026-02-16T21:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.180948 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.180992 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.181060 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:20 crc kubenswrapper[4777]: E0216 21:39:20.182273 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:20 crc kubenswrapper[4777]: E0216 21:39:20.182786 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:20 crc kubenswrapper[4777]: E0216 21:39:20.183023 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.188148 4777 scope.go:117] "RemoveContainer" containerID="9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.207121 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\
\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.228876 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.256457 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.271371 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.273806 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.273991 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.274134 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.274266 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:20Z","lastTransitionTime":"2026-02-16T21:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.275262 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.298041 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 
21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.319304 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.337879 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.355240 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.373092 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.377216 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.377276 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.377299 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.377328 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.377347 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:20Z","lastTransitionTime":"2026-02-16T21:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.390345 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.408224 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.425518 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.441559 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9df40036-ebf0-42db-a3c7-66862da49bcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a5f7e088b21d8907a92a83e39480a53b84fb0c177281e0ee91eac8409f5858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://090314b845c376f3298674de547ea13a6de12b9dfb56457af52c142701c93a8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84083a9adcea879674a04ba75af8d30d25553064123c03b6a1425b90f08632ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.455067 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.470606 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b7c71b79ec54c7373971886c4f36d1a4e31d558dbecbb5925db4d29961baeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:39:09Z\\\",\\\"message\\\":\\\"2026-02-16T21:38:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed\\\\n2026-02-16T21:38:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed to /host/opt/cni/bin/\\\\n2026-02-16T21:38:24Z [verbose] multus-daemon started\\\\n2026-02-16T21:38:24Z [verbose] 
Readiness Indicator file check\\\\n2026-02-16T21:39:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.479678 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.479749 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.479771 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.479795 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.479815 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:20Z","lastTransitionTime":"2026-02-16T21:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.481603 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 03:52:08.569155064 +0000 UTC Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.506672 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI0216 21:38:52.999210 6424 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0216 21:38:52.999277 6424 factory.go:1336] Added *v1.Node event handler 7\\\\nI0216 21:38:52.999310 6424 
factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999312 6424 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:52.999332 6424 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:52.999369 6424 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 21:38:52.999368 6424 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:52.999395 6424 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:52.999426 6424 factory.go:656] Stopping watch factory\\\\nI0216 21:38:52.999454 6424 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999948 6424 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:53.000052 6424 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:53.000094 6424 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:53.000124 6424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:53.000201 6424 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b9761152
1fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.522952 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.583602 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.583662 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.583684 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.583756 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.583787 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:20Z","lastTransitionTime":"2026-02-16T21:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.685880 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.685944 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.685966 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.685996 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.686018 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:20Z","lastTransitionTime":"2026-02-16T21:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.789021 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.789082 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.789099 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.789121 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.789137 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:20Z","lastTransitionTime":"2026-02-16T21:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.800212 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/2.log" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.803615 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993"} Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.804118 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.831784 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.847407 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.865219 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.883105 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9df40036-ebf0-42db-a3c7-66862da49bcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a5f7e088b21d8907a92a83e39480a53b84fb0c177281e0ee91eac8409f5858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://090314b845c376f3298674de547ea13a6de12b9dfb56457af52c142701c93a8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84083a9adcea879674a04ba75af8d30d25553064123c03b6a1425b90f08632ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.891988 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.892230 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.892351 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.892460 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.892560 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:20Z","lastTransitionTime":"2026-02-16T21:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.899862 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.921730 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b7c71b79ec54c7373971886c4f36d1a4e31d558dbecbb5925db4d29961baeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:39:09Z\\\",\\\"message\\\":\\\"2026-02-16T21:38:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed\\\\n2026-02-16T21:38:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed to /host/opt/cni/bin/\\\\n2026-02-16T21:38:24Z [verbose] multus-daemon started\\\\n2026-02-16T21:38:24Z [verbose] 
Readiness Indicator file check\\\\n2026-02-16T21:39:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.944536 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI0216 21:38:52.999210 6424 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0216 21:38:52.999277 6424 factory.go:1336] Added *v1.Node event handler 7\\\\nI0216 21:38:52.999310 6424 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999312 6424 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:52.999332 6424 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:52.999369 6424 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 21:38:52.999368 6424 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:52.999395 6424 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:52.999426 6424 factory.go:656] Stopping watch factory\\\\nI0216 21:38:52.999454 6424 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999948 6424 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:53.000052 6424 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:53.000094 6424 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:53.000124 6424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:53.000201 6424 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.960583 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc 
kubenswrapper[4777]: I0216 21:39:20.976037 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.994327 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:20Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.997467 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.997505 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.997515 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.997536 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:20 crc kubenswrapper[4777]: I0216 21:39:20.997547 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:20Z","lastTransitionTime":"2026-02-16T21:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.010668 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.023359 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.048643 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T2
1:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.070129 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.082732 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.095995 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.099644 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.099741 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.099757 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.099775 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.099790 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:21Z","lastTransitionTime":"2026-02-16T21:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.112483 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.181223 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:21 crc kubenswrapper[4777]: E0216 21:39:21.181396 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.202479 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.202514 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.202524 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.202539 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.202551 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:21Z","lastTransitionTime":"2026-02-16T21:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.305069 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.305108 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.305120 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.305135 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.305145 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:21Z","lastTransitionTime":"2026-02-16T21:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.407674 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.407735 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.407749 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.407768 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.407783 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:21Z","lastTransitionTime":"2026-02-16T21:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.481895 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 01:35:44.517698494 +0000 UTC Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.510538 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.510602 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.510621 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.510647 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.510667 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:21Z","lastTransitionTime":"2026-02-16T21:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.614116 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.614210 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.614235 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.614272 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.614294 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:21Z","lastTransitionTime":"2026-02-16T21:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.717585 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.717644 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.717674 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.717700 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.717739 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:21Z","lastTransitionTime":"2026-02-16T21:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.810485 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/3.log" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.811100 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/2.log" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.814308 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993" exitCode=1 Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.814377 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993"} Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.814457 4777 scope.go:117] "RemoveContainer" containerID="9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.815053 4777 scope.go:117] "RemoveContainer" containerID="2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993" Feb 16 21:39:21 crc kubenswrapper[4777]: E0216 21:39:21.815246 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.819257 4777 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.819304 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.819320 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.819346 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.819363 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:21Z","lastTransitionTime":"2026-02-16T21:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.836055 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.852390 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8tx24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a072a4b3-dbc6-4e52-8a35-2e67069603c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899a2c6da005a2e42878fdf435219a8de3b403da4ca8e28e745b088f20bb9d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6hc9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8tx24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.866001 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbd6cb2a-0e80-4642-ad1e-993774971496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea3d7ec30e157341d4ba8c533e9fa680561d723b59607b11d8bb2ca62cfd5f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjqvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h78cj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.880811 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4e8a841-543c-4825-9320-d66b0bc2438e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-16T21:38:19Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0216 21:38:13.777106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0216 21:38:13.781373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1219691439/tls.crt::/tmp/serving-cert-1219691439/tls.key\\\\\\\"\\\\nI0216 21:38:19.904336 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0216 21:38:19.908085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0216 21:38:19.908106 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0216 21:38:19.908126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0216 21:38:19.908131 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0216 21:38:19.919734 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0216 21:38:19.919770 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919777 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0216 21:38:19.919787 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0216 21:38:19.919792 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0216 21:38:19.919798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0216 21:38:19.919805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0216 21:38:19.920048 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0216 21:38:19.921780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.894747 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddb4b68f0bcfa55ea27c123ee5152cd9ee16efb8f6949cdaf79f8e8a889e8144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.906861 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b6c3c71-4486-4eae-8aae-424892a1d703\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24b8275b9a68ccb0d89ab40c6532108c5f39704e49f3ddce4aa16c81dd37d842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://338dc3a345c67ca6183da812bebbd0f5347cb4a8c5059bb2fb7ee0c21f6c7c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjjhd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-npnnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.920365 4777 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.924035 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.924117 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.924131 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.924153 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.924167 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:21Z","lastTransitionTime":"2026-02-16T21:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.939093 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b488489aac123d7d31333826b48debc93ffb6fb778a7837c24c9acaa47642e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.958900 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3c293d7-2d38-4047-a104-7f354aebf216\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9978e3783d4b773ad483f33f9b3a453fe23cadf34ec3150f18992c981b5708d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"message\\\":\\\":default/a8519615025667110816) with []\\\\nI0216 21:38:52.999210 6424 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0216 21:38:52.999277 6424 factory.go:1336] Added *v1.Node event handler 7\\\\nI0216 21:38:52.999310 6424 
factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999312 6424 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0216 21:38:52.999332 6424 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0216 21:38:52.999369 6424 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 21:38:52.999368 6424 handler.go:208] Removed *v1.Node event handler 2\\\\nI0216 21:38:52.999395 6424 handler.go:208] Removed *v1.Node event handler 7\\\\nI0216 21:38:52.999426 6424 factory.go:656] Stopping watch factory\\\\nI0216 21:38:52.999454 6424 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:38:52.999948 6424 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:38:53.000052 6424 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:38:53.000094 6424 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:38:53.000124 6424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:38:53.000201 6424 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:39:21Z\\\",\\\"message\\\":\\\" handler 6\\\\nI0216 21:39:21.169911 6837 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0216 21:39:21.169965 6837 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0216 21:39:21.170004 6837 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0216 21:39:21.170015 6837 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0216 21:39:21.170024 6837 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0216 21:39:21.170031 6837 handler.go:190] Sending *v1.EgressFirewall event 
handler 9 for removal\\\\nI0216 21:39:21.170083 6837 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0216 21:39:21.170099 6837 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0216 21:39:21.170088 6837 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0216 21:39:21.170155 6837 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0216 21:39:21.170163 6837 factory.go:656] Stopping watch factory\\\\nI0216 21:39:21.170191 6837 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0216 21:39:21.170201 6837 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0216 21:39:21.170225 6837 ovnkube.go:599] Stopped ovnkube\\\\nI0216 21:39:21.170261 6837 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0216 21:39:21.170361 6837 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-
netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k2hhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w27qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc kubenswrapper[4777]: I0216 21:39:21.975086 4777 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-rwm84" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9dpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rwm84\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:21 crc 
kubenswrapper[4777]: I0216 21:39:21.993965 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9df40036-ebf0-42db-a3c7-66862da49bcb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a5f7e088b21d8907a92a83e39480a53b84fb0c177281e0ee91eac8409f5858\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://090314b845c376f3298674de547ea13a6de12b9dfb56457af52c142701c93a8e\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84083a9adcea879674a04ba75af8d30d25553064123c03b6a1425b90f08632ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83e99a71ac6c435439df9f16c1ab41eb1f7653d84652942092018d3c3981edd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:21Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.013790 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.028235 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.028300 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.028319 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 
21:39:22.028345 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.028364 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:22Z","lastTransitionTime":"2026-02-16T21:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.033982 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vpf28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71656da7-4f33-419d-aaba-93bf9158f706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b7c71b79ec54c7373971886c4f36d1a4e31d558dbecbb5925db4d29961baeb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-16T21:39:09Z\\\",\\\"message\\\":\\\"2026-02-16T21:38:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed\\\\n2026-02-16T21:38:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_239be2db-fc8a-4d2e-a9d6-548433db32ed to /host/opt/cni/bin/\\\\n2026-02-16T21:38:24Z [verbose] multus-daemon started\\\\n2026-02-16T21:38:24Z [verbose] Readiness Indicator file check\\\\n2026-02-16T21:39:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5zs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vpf28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.049325 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q2ff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7818180b-37e3-4bbb-bdd6-7ac570c5ea2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d587bf096747e906062b43bd01b6bbb3bf24694f6f64f6cad72d4590f75c9a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rfpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q2ff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.062959 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39852a0e-d4fc-4c16-abfb-ab6974ff5e82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cc369fecc3c56f0c5e30ef017c02cec0f14aeedcde32175776546d56641e938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf2d91dea926edf012e9006ebd53e01fd8b7d826ac98703d68ae1e22d6eba074\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://426c2ae49d2260e211cd2750497ea0c2d22ef3c11fcab66f94ed79e78597bfbf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-16T21:38:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.075031 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d28640f970567b8ca06cff16f9274a6edb97dc52a5a7ef7c8cd51b799359187a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4815962b730b0796f89651c656d56c153bd816bd9677aefd81918b63116ae7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.088116 4777 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c4c6202-048b-4373-9f44-f5eb0de89993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-16T21:38:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff6e52702d3923d5b816022459e20c2e0e6d64ea55f853c177925a33d0706b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-16T21:38:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45badfbee2bcd8d9db57552c1c447f55ed746c1c4c71396f80a62502388303f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0597445b8edbf428d1d917adc79367f916793a2d45f8206c08d2fdb6feae706f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dab537bc1ba86b7fe203e2da7bc1de94a7f53bfa8d9c6e55455414a56d92b54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7fe0
eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7fe0eb832c5af5356285b7b04b2075a3b478cc504d7873eac4b8d2dc7f0fd53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7a0730bb9bbbea73cc9884d50304aefd9fb4284af16e08da0f0fb51b5656a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://127902163c9186bd65b525e708063cbebfad31f9135319815e44e7e082c94d86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-16T21:38:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-16T21:38:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xrdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-16T21:38:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zq5h9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-16T21:39:22Z is after 2025-08-24T17:21:41Z" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.132025 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.132100 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.132117 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.132142 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.132161 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:22Z","lastTransitionTime":"2026-02-16T21:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.181933 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.182206 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:22 crc kubenswrapper[4777]: E0216 21:39:22.182379 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.182404 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:22 crc kubenswrapper[4777]: E0216 21:39:22.182553 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:22 crc kubenswrapper[4777]: E0216 21:39:22.182681 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.235769 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.235849 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.235871 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.235905 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.235927 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:22Z","lastTransitionTime":"2026-02-16T21:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.340568 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.340677 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.340702 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.340810 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.340880 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:22Z","lastTransitionTime":"2026-02-16T21:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.445672 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.445766 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.445782 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.445807 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.445824 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:22Z","lastTransitionTime":"2026-02-16T21:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.483025 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 09:40:41.786461453 +0000 UTC Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.548526 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.548568 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.548579 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.548598 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.548611 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:22Z","lastTransitionTime":"2026-02-16T21:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.652020 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.652067 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.652081 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.652101 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.652118 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:22Z","lastTransitionTime":"2026-02-16T21:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.756044 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.756130 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.756153 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.756190 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.756212 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:22Z","lastTransitionTime":"2026-02-16T21:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.821852 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/3.log" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.827797 4777 scope.go:117] "RemoveContainer" containerID="2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993" Feb 16 21:39:22 crc kubenswrapper[4777]: E0216 21:39:22.828065 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.859887 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.860027 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.860243 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.860440 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.860556 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:22Z","lastTransitionTime":"2026-02-16T21:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.862704 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=58.862685775 podStartE2EDuration="58.862685775s" podCreationTimestamp="2026-02-16 21:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:39:22.86217095 +0000 UTC m=+83.444672162" watchObservedRunningTime="2026-02-16 21:39:22.862685775 +0000 UTC m=+83.445186917" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.912996 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zq5h9" podStartSLOduration=61.912960872 podStartE2EDuration="1m1.912960872s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:39:22.912790697 +0000 UTC m=+83.495291899" watchObservedRunningTime="2026-02-16 21:39:22.912960872 +0000 UTC m=+83.495462014" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.933487 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q2ff8" podStartSLOduration=61.933458058 podStartE2EDuration="1m1.933458058s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:39:22.933097538 +0000 UTC m=+83.515598680" watchObservedRunningTime="2026-02-16 21:39:22.933458058 +0000 UTC m=+83.515959200" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.958543 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=62.95851496 podStartE2EDuration="1m2.95851496s" podCreationTimestamp="2026-02-16 21:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:39:22.958037355 +0000 UTC m=+83.540538507" watchObservedRunningTime="2026-02-16 21:39:22.95851496 +0000 UTC m=+83.541016102" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.964658 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.964769 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.964788 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.964811 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:22 crc kubenswrapper[4777]: I0216 21:39:22.964830 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:22Z","lastTransitionTime":"2026-02-16T21:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.031399 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podStartSLOduration=62.031366705 podStartE2EDuration="1m2.031366705s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:39:23.030682444 +0000 UTC m=+83.613183636" watchObservedRunningTime="2026-02-16 21:39:23.031366705 +0000 UTC m=+83.613867857" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.032108 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8tx24" podStartSLOduration=62.032089536 podStartE2EDuration="1m2.032089536s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:39:23.016057332 +0000 UTC m=+83.598558494" watchObservedRunningTime="2026-02-16 21:39:23.032089536 +0000 UTC m=+83.614590688" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.067567 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.067622 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.067638 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.067659 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.067676 4777 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:23Z","lastTransitionTime":"2026-02-16T21:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.079314 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-npnnx" podStartSLOduration=61.079282282 podStartE2EDuration="1m1.079282282s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:39:23.078155519 +0000 UTC m=+83.660656681" watchObservedRunningTime="2026-02-16 21:39:23.079282282 +0000 UTC m=+83.661783424" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.095014 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=30.094981177 podStartE2EDuration="30.094981177s" podCreationTimestamp="2026-02-16 21:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:39:23.094562224 +0000 UTC m=+83.677063376" watchObservedRunningTime="2026-02-16 21:39:23.094981177 +0000 UTC m=+83.677482319" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.123586 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vpf28" podStartSLOduration=62.123566512 podStartE2EDuration="1m2.123566512s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 
21:39:23.123036366 +0000 UTC m=+83.705537518" watchObservedRunningTime="2026-02-16 21:39:23.123566512 +0000 UTC m=+83.706067634" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.170656 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.170707 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.170745 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.170769 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.170787 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:23Z","lastTransitionTime":"2026-02-16T21:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.181070 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:23 crc kubenswrapper[4777]: E0216 21:39:23.181287 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.274426 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.274480 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.274493 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.274517 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.274534 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:23Z","lastTransitionTime":"2026-02-16T21:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.378422 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.378469 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.378479 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.378494 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.378504 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:23Z","lastTransitionTime":"2026-02-16T21:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.482103 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.482195 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.482215 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.482246 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.482266 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:23Z","lastTransitionTime":"2026-02-16T21:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.484197 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 23:24:00.899184171 +0000 UTC Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.585469 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.585534 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.585551 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.585577 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.585597 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:23Z","lastTransitionTime":"2026-02-16T21:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.689642 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.689708 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.689770 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.689801 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.689824 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:23Z","lastTransitionTime":"2026-02-16T21:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.793318 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.793395 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.793417 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.793445 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.793465 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:23Z","lastTransitionTime":"2026-02-16T21:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.896998 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.897076 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.897099 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.897137 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:23 crc kubenswrapper[4777]: I0216 21:39:23.897162 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:23Z","lastTransitionTime":"2026-02-16T21:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.000944 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.001014 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.001033 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.001058 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.001079 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:24Z","lastTransitionTime":"2026-02-16T21:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.106334 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.106380 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.106396 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.106419 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.106437 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:24Z","lastTransitionTime":"2026-02-16T21:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.180820 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.181028 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.181381 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.181552 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.181983 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.182138 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.199641 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.210106 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.210170 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.210190 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.210216 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.210235 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:24Z","lastTransitionTime":"2026-02-16T21:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.315876 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.315937 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.315954 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.315978 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.315997 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:24Z","lastTransitionTime":"2026-02-16T21:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.418848 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.418892 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.418903 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.418920 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.418931 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:24Z","lastTransitionTime":"2026-02-16T21:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.472410 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.472590 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.472660 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:28.472623519 +0000 UTC m=+149.055124661 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.472753 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.472808 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.472875 4777 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.472881 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.472926 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.472940 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:40:28.472924498 +0000 UTC m=+149.055425610 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.472949 4777 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.472977 4777 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.473005 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-16 21:40:28.47298681 +0000 UTC m=+149.055487952 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.472875 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.473041 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-16 21:40:28.473022311 +0000 UTC m=+149.055523453 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.473230 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.473283 4777 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.473311 4777 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:39:24 crc kubenswrapper[4777]: E0216 21:39:24.473448 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-16 21:40:28.473410962 +0000 UTC m=+149.055912104 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.484861 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 18:35:59.15550325 +0000 UTC Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.522701 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.522824 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.522851 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.522883 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.522908 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:24Z","lastTransitionTime":"2026-02-16T21:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.625523 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.625598 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.625616 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.625640 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.625657 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:24Z","lastTransitionTime":"2026-02-16T21:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.728798 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.728876 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.728899 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.728929 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.728957 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:24Z","lastTransitionTime":"2026-02-16T21:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.833903 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.833983 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.834008 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.834041 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.834065 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:24Z","lastTransitionTime":"2026-02-16T21:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.937022 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.937101 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.937124 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.937152 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:24 crc kubenswrapper[4777]: I0216 21:39:24.937174 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:24Z","lastTransitionTime":"2026-02-16T21:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.040676 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.040768 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.040790 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.040817 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.040837 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:25Z","lastTransitionTime":"2026-02-16T21:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.144969 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.145030 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.145046 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.145071 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.145093 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:25Z","lastTransitionTime":"2026-02-16T21:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.183214 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:25 crc kubenswrapper[4777]: E0216 21:39:25.183434 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.247601 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.247731 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.247754 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.247783 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.247806 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:25Z","lastTransitionTime":"2026-02-16T21:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.352485 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.352562 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.352578 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.352605 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.352624 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:25Z","lastTransitionTime":"2026-02-16T21:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.456012 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.456099 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.456124 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.456154 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.456174 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:25Z","lastTransitionTime":"2026-02-16T21:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.486024 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 11:48:21.826056254 +0000 UTC Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.560300 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.560669 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.560733 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.560772 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.560802 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:25Z","lastTransitionTime":"2026-02-16T21:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.664117 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.664184 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.664204 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.664231 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.664249 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:25Z","lastTransitionTime":"2026-02-16T21:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.767150 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.767219 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.767238 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.767264 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.767285 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:25Z","lastTransitionTime":"2026-02-16T21:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.872262 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.872360 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.872378 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.872439 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.872460 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:25Z","lastTransitionTime":"2026-02-16T21:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.975008 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.975048 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.975057 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.975073 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:25 crc kubenswrapper[4777]: I0216 21:39:25.975082 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:25Z","lastTransitionTime":"2026-02-16T21:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.078248 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.078332 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.078365 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.078404 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.078427 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:26Z","lastTransitionTime":"2026-02-16T21:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.181293 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.181293 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:26 crc kubenswrapper[4777]: E0216 21:39:26.181438 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:26 crc kubenswrapper[4777]: E0216 21:39:26.181502 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.181409 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:26 crc kubenswrapper[4777]: E0216 21:39:26.181590 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.181771 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.181798 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.181809 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.181824 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.181837 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:26Z","lastTransitionTime":"2026-02-16T21:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.284378 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.284427 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.284437 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.284456 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.284469 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:26Z","lastTransitionTime":"2026-02-16T21:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.387240 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.387276 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.387286 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.387301 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.387310 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:26Z","lastTransitionTime":"2026-02-16T21:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.486641 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 12:03:15.613461255 +0000 UTC Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.490103 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.490189 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.490201 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.490221 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.490237 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:26Z","lastTransitionTime":"2026-02-16T21:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.593075 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.593141 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.593159 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.593190 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.593210 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:26Z","lastTransitionTime":"2026-02-16T21:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.696361 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.696431 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.696443 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.696463 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.696480 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:26Z","lastTransitionTime":"2026-02-16T21:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.799605 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.799736 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.799756 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.799780 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.799803 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:26Z","lastTransitionTime":"2026-02-16T21:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.902694 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.902778 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.902792 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.902818 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:26 crc kubenswrapper[4777]: I0216 21:39:26.902833 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:26Z","lastTransitionTime":"2026-02-16T21:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.006494 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.006564 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.006583 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.006616 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.006637 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:27Z","lastTransitionTime":"2026-02-16T21:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.111359 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.111431 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.111450 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.111479 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.111498 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:27Z","lastTransitionTime":"2026-02-16T21:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.181222 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:27 crc kubenswrapper[4777]: E0216 21:39:27.181488 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.215028 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.215099 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.215119 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.215147 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.215166 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:27Z","lastTransitionTime":"2026-02-16T21:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.319124 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.319203 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.319227 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.319260 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.319282 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:27Z","lastTransitionTime":"2026-02-16T21:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.422109 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.422181 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.422200 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.422233 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.422252 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:27Z","lastTransitionTime":"2026-02-16T21:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.487124 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 21:48:03.156760586 +0000 UTC Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.525257 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.525323 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.525340 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.525366 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.525385 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:27Z","lastTransitionTime":"2026-02-16T21:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.629506 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.629677 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.629702 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.629838 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.629864 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:27Z","lastTransitionTime":"2026-02-16T21:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.733047 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.733121 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.733147 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.733184 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.733210 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:27Z","lastTransitionTime":"2026-02-16T21:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.837122 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.837188 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.837202 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.837221 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.837234 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:27Z","lastTransitionTime":"2026-02-16T21:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.838629 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.838690 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.838708 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.838765 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 16 21:39:27 crc kubenswrapper[4777]: I0216 21:39:27.838785 4777 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-16T21:39:27Z","lastTransitionTime":"2026-02-16T21:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.181422 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.181544 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.181431 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:28 crc kubenswrapper[4777]: E0216 21:39:28.181673 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:28 crc kubenswrapper[4777]: E0216 21:39:28.181893 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:28 crc kubenswrapper[4777]: E0216 21:39:28.182083 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.488278 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 21:29:20.164679951 +0000 UTC Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.593103 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g"] Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.593766 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.597573 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.597843 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.599130 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.602456 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.727947 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc 
kubenswrapper[4777]: I0216 21:39:28.728044 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.728084 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.728126 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.728212 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.829366 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.829473 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.829576 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.829631 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.829706 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc 
kubenswrapper[4777]: I0216 21:39:28.829890 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.830024 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.831405 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.844027 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: \"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.861176 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6x98g\" (UID: 
\"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:28 crc kubenswrapper[4777]: I0216 21:39:28.916707 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" Feb 16 21:39:29 crc kubenswrapper[4777]: I0216 21:39:29.180949 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:29 crc kubenswrapper[4777]: E0216 21:39:29.181153 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:29 crc kubenswrapper[4777]: I0216 21:39:29.488624 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 20:59:09.118385169 +0000 UTC Feb 16 21:39:29 crc kubenswrapper[4777]: I0216 21:39:29.488808 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 16 21:39:29 crc kubenswrapper[4777]: I0216 21:39:29.503106 4777 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 16 21:39:29 crc kubenswrapper[4777]: I0216 21:39:29.855930 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" event={"ID":"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe","Type":"ContainerStarted","Data":"37fa99d0eef2f90103c0ade1fb934e884698f2909e96d161eefa5dbc8b41f8db"} Feb 16 21:39:29 crc kubenswrapper[4777]: I0216 21:39:29.856010 4777 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" event={"ID":"3b7133e0-9781-4d6f-a6a6-2fdc08cbf5fe","Type":"ContainerStarted","Data":"2dfdf9f4df3747cdac7c3809ec5810c77494a92657f65e98c5885cd78ba0bd03"} Feb 16 21:39:29 crc kubenswrapper[4777]: I0216 21:39:29.879204 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.879172625 podStartE2EDuration="5.879172625s" podCreationTimestamp="2026-02-16 21:39:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:39:28.641216604 +0000 UTC m=+89.223717736" watchObservedRunningTime="2026-02-16 21:39:29.879172625 +0000 UTC m=+90.461673757" Feb 16 21:39:29 crc kubenswrapper[4777]: I0216 21:39:29.880452 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6x98g" podStartSLOduration=68.880440172 podStartE2EDuration="1m8.880440172s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:39:29.880140164 +0000 UTC m=+90.462641296" watchObservedRunningTime="2026-02-16 21:39:29.880440172 +0000 UTC m=+90.462941314" Feb 16 21:39:30 crc kubenswrapper[4777]: I0216 21:39:30.182104 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:30 crc kubenswrapper[4777]: I0216 21:39:30.182149 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:30 crc kubenswrapper[4777]: I0216 21:39:30.182043 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:30 crc kubenswrapper[4777]: E0216 21:39:30.182661 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:30 crc kubenswrapper[4777]: E0216 21:39:30.183077 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:30 crc kubenswrapper[4777]: E0216 21:39:30.183171 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:31 crc kubenswrapper[4777]: I0216 21:39:31.180935 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:31 crc kubenswrapper[4777]: E0216 21:39:31.181433 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:32 crc kubenswrapper[4777]: I0216 21:39:32.181610 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:32 crc kubenswrapper[4777]: I0216 21:39:32.181747 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:32 crc kubenswrapper[4777]: E0216 21:39:32.182850 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:32 crc kubenswrapper[4777]: E0216 21:39:32.182876 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:32 crc kubenswrapper[4777]: I0216 21:39:32.181788 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:32 crc kubenswrapper[4777]: E0216 21:39:32.183410 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:33 crc kubenswrapper[4777]: I0216 21:39:33.181515 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:33 crc kubenswrapper[4777]: E0216 21:39:33.181750 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:34 crc kubenswrapper[4777]: I0216 21:39:34.181558 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:34 crc kubenswrapper[4777]: I0216 21:39:34.181639 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:34 crc kubenswrapper[4777]: I0216 21:39:34.181667 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:34 crc kubenswrapper[4777]: E0216 21:39:34.182105 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:34 crc kubenswrapper[4777]: E0216 21:39:34.182228 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:34 crc kubenswrapper[4777]: E0216 21:39:34.182342 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:35 crc kubenswrapper[4777]: I0216 21:39:35.180916 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:35 crc kubenswrapper[4777]: E0216 21:39:35.181121 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:35 crc kubenswrapper[4777]: I0216 21:39:35.182847 4777 scope.go:117] "RemoveContainer" containerID="2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993" Feb 16 21:39:35 crc kubenswrapper[4777]: E0216 21:39:35.183246 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" Feb 16 21:39:36 crc kubenswrapper[4777]: I0216 21:39:36.181761 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:36 crc kubenswrapper[4777]: I0216 21:39:36.181832 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:36 crc kubenswrapper[4777]: I0216 21:39:36.181781 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:36 crc kubenswrapper[4777]: E0216 21:39:36.182005 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:36 crc kubenswrapper[4777]: E0216 21:39:36.182242 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:36 crc kubenswrapper[4777]: E0216 21:39:36.182490 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:37 crc kubenswrapper[4777]: I0216 21:39:37.181388 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:37 crc kubenswrapper[4777]: E0216 21:39:37.181649 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:38 crc kubenswrapper[4777]: I0216 21:39:38.181556 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:38 crc kubenswrapper[4777]: I0216 21:39:38.181584 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:38 crc kubenswrapper[4777]: I0216 21:39:38.181659 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:38 crc kubenswrapper[4777]: E0216 21:39:38.181795 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:38 crc kubenswrapper[4777]: E0216 21:39:38.181976 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:38 crc kubenswrapper[4777]: E0216 21:39:38.182145 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:39 crc kubenswrapper[4777]: I0216 21:39:39.181195 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:39 crc kubenswrapper[4777]: E0216 21:39:39.181439 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:40 crc kubenswrapper[4777]: I0216 21:39:40.181183 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:40 crc kubenswrapper[4777]: I0216 21:39:40.187563 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:40 crc kubenswrapper[4777]: I0216 21:39:40.187619 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:40 crc kubenswrapper[4777]: E0216 21:39:40.187830 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:40 crc kubenswrapper[4777]: E0216 21:39:40.187988 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:40 crc kubenswrapper[4777]: E0216 21:39:40.188910 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:40 crc kubenswrapper[4777]: I0216 21:39:40.571922 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:40 crc kubenswrapper[4777]: E0216 21:39:40.572182 4777 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:39:40 crc kubenswrapper[4777]: E0216 21:39:40.572332 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs podName:a1d5abcd-6e58-4563-98c6-0adb808ed0a7 nodeName:}" failed. No retries permitted until 2026-02-16 21:40:44.572298066 +0000 UTC m=+165.154799198 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs") pod "network-metrics-daemon-rwm84" (UID: "a1d5abcd-6e58-4563-98c6-0adb808ed0a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 16 21:39:41 crc kubenswrapper[4777]: I0216 21:39:41.180922 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:41 crc kubenswrapper[4777]: E0216 21:39:41.181062 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:42 crc kubenswrapper[4777]: I0216 21:39:42.181125 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:42 crc kubenswrapper[4777]: I0216 21:39:42.181218 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:42 crc kubenswrapper[4777]: E0216 21:39:42.181342 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:42 crc kubenswrapper[4777]: E0216 21:39:42.181532 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:42 crc kubenswrapper[4777]: I0216 21:39:42.181616 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:42 crc kubenswrapper[4777]: E0216 21:39:42.181705 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:43 crc kubenswrapper[4777]: I0216 21:39:43.181398 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:43 crc kubenswrapper[4777]: E0216 21:39:43.181597 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:44 crc kubenswrapper[4777]: I0216 21:39:44.181215 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:44 crc kubenswrapper[4777]: I0216 21:39:44.181328 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:44 crc kubenswrapper[4777]: E0216 21:39:44.181445 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:44 crc kubenswrapper[4777]: E0216 21:39:44.181844 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:44 crc kubenswrapper[4777]: I0216 21:39:44.182272 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:44 crc kubenswrapper[4777]: E0216 21:39:44.182468 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:45 crc kubenswrapper[4777]: I0216 21:39:45.181631 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:45 crc kubenswrapper[4777]: E0216 21:39:45.183204 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:46 crc kubenswrapper[4777]: I0216 21:39:46.181240 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:46 crc kubenswrapper[4777]: I0216 21:39:46.181293 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:46 crc kubenswrapper[4777]: I0216 21:39:46.181260 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:46 crc kubenswrapper[4777]: E0216 21:39:46.181428 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:46 crc kubenswrapper[4777]: E0216 21:39:46.181537 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:46 crc kubenswrapper[4777]: E0216 21:39:46.181757 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:47 crc kubenswrapper[4777]: I0216 21:39:47.181963 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:47 crc kubenswrapper[4777]: E0216 21:39:47.183476 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:47 crc kubenswrapper[4777]: I0216 21:39:47.183812 4777 scope.go:117] "RemoveContainer" containerID="2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993" Feb 16 21:39:47 crc kubenswrapper[4777]: E0216 21:39:47.184038 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" Feb 16 21:39:48 crc kubenswrapper[4777]: I0216 21:39:48.180786 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:48 crc kubenswrapper[4777]: E0216 21:39:48.180978 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:48 crc kubenswrapper[4777]: I0216 21:39:48.181083 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:48 crc kubenswrapper[4777]: E0216 21:39:48.181289 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:48 crc kubenswrapper[4777]: I0216 21:39:48.181739 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:48 crc kubenswrapper[4777]: E0216 21:39:48.182104 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:48 crc kubenswrapper[4777]: I0216 21:39:48.204680 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 16 21:39:49 crc kubenswrapper[4777]: I0216 21:39:49.180816 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:39:49 crc kubenswrapper[4777]: E0216 21:39:49.181026 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:39:50 crc kubenswrapper[4777]: I0216 21:39:50.180887 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:39:50 crc kubenswrapper[4777]: I0216 21:39:50.180974 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:39:50 crc kubenswrapper[4777]: E0216 21:39:50.184973 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:39:50 crc kubenswrapper[4777]: I0216 21:39:50.185005 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:39:50 crc kubenswrapper[4777]: E0216 21:39:50.185135 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:39:50 crc kubenswrapper[4777]: E0216 21:39:50.185405 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:39:50 crc kubenswrapper[4777]: I0216 21:39:50.221542 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=2.221509525 podStartE2EDuration="2.221509525s" podCreationTimestamp="2026-02-16 21:39:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:39:50.220073712 +0000 UTC m=+110.802574854" watchObservedRunningTime="2026-02-16 21:39:50.221509525 +0000 UTC m=+110.804010667" Feb 16 21:39:51 crc kubenswrapper[4777]: I0216 21:39:51.181043 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:39:51 crc kubenswrapper[4777]: E0216 21:39:51.181186 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:39:52 crc kubenswrapper[4777]: I0216 21:39:52.181707 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:39:52 crc kubenswrapper[4777]: I0216 21:39:52.181823 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:39:52 crc kubenswrapper[4777]: I0216 21:39:52.181866 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:39:52 crc kubenswrapper[4777]: E0216 21:39:52.181994 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:39:52 crc kubenswrapper[4777]: E0216 21:39:52.182108 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:39:52 crc kubenswrapper[4777]: E0216 21:39:52.182285 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:39:53 crc kubenswrapper[4777]: I0216 21:39:53.181582 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:39:53 crc kubenswrapper[4777]: E0216 21:39:53.181886 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:39:54 crc kubenswrapper[4777]: I0216 21:39:54.181050 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:39:54 crc kubenswrapper[4777]: I0216 21:39:54.181148 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:39:54 crc kubenswrapper[4777]: I0216 21:39:54.181250 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:39:54 crc kubenswrapper[4777]: E0216 21:39:54.181479 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:39:54 crc kubenswrapper[4777]: E0216 21:39:54.181665 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:39:54 crc kubenswrapper[4777]: E0216 21:39:54.181882 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:39:55 crc kubenswrapper[4777]: I0216 21:39:55.181456 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:39:55 crc kubenswrapper[4777]: E0216 21:39:55.181817 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:39:55 crc kubenswrapper[4777]: I0216 21:39:55.966994 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vpf28_71656da7-4f33-419d-aaba-93bf9158f706/kube-multus/1.log"
Feb 16 21:39:55 crc kubenswrapper[4777]: I0216 21:39:55.968240 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vpf28_71656da7-4f33-419d-aaba-93bf9158f706/kube-multus/0.log"
Feb 16 21:39:55 crc kubenswrapper[4777]: I0216 21:39:55.968334 4777 generic.go:334] "Generic (PLEG): container finished" podID="71656da7-4f33-419d-aaba-93bf9158f706" containerID="0b7c71b79ec54c7373971886c4f36d1a4e31d558dbecbb5925db4d29961baeb7" exitCode=1
Feb 16 21:39:55 crc kubenswrapper[4777]: I0216 21:39:55.968391 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vpf28" event={"ID":"71656da7-4f33-419d-aaba-93bf9158f706","Type":"ContainerDied","Data":"0b7c71b79ec54c7373971886c4f36d1a4e31d558dbecbb5925db4d29961baeb7"}
Feb 16 21:39:55 crc kubenswrapper[4777]: I0216 21:39:55.968455 4777 scope.go:117] "RemoveContainer" containerID="9b5b9fe91b12307b3028ed770e8688cf9c90b319d1b89da10b112430981a59de"
Feb 16 21:39:55 crc kubenswrapper[4777]: I0216 21:39:55.969662 4777 scope.go:117] "RemoveContainer" containerID="0b7c71b79ec54c7373971886c4f36d1a4e31d558dbecbb5925db4d29961baeb7"
Feb 16 21:39:55 crc kubenswrapper[4777]: E0216 21:39:55.970243 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-vpf28_openshift-multus(71656da7-4f33-419d-aaba-93bf9158f706)\"" pod="openshift-multus/multus-vpf28" podUID="71656da7-4f33-419d-aaba-93bf9158f706"
Feb 16 21:39:56 crc kubenswrapper[4777]: I0216 21:39:56.181262 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:39:56 crc kubenswrapper[4777]: I0216 21:39:56.181702 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:39:56 crc kubenswrapper[4777]: E0216 21:39:56.181818 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:39:56 crc kubenswrapper[4777]: I0216 21:39:56.181357 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:39:56 crc kubenswrapper[4777]: E0216 21:39:56.182088 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:39:56 crc kubenswrapper[4777]: E0216 21:39:56.182318 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:39:56 crc kubenswrapper[4777]: I0216 21:39:56.978089 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vpf28_71656da7-4f33-419d-aaba-93bf9158f706/kube-multus/1.log"
Feb 16 21:39:57 crc kubenswrapper[4777]: I0216 21:39:57.181552 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:39:57 crc kubenswrapper[4777]: E0216 21:39:57.181804 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:39:58 crc kubenswrapper[4777]: I0216 21:39:58.186840 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:39:58 crc kubenswrapper[4777]: E0216 21:39:58.187033 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:39:58 crc kubenswrapper[4777]: I0216 21:39:58.186845 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:39:58 crc kubenswrapper[4777]: I0216 21:39:58.187117 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:39:58 crc kubenswrapper[4777]: E0216 21:39:58.187203 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:39:58 crc kubenswrapper[4777]: E0216 21:39:58.187392 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:39:59 crc kubenswrapper[4777]: I0216 21:39:59.180754 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:39:59 crc kubenswrapper[4777]: E0216 21:39:59.180916 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:40:00 crc kubenswrapper[4777]: I0216 21:40:00.181642 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:40:00 crc kubenswrapper[4777]: E0216 21:40:00.182877 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:40:00 crc kubenswrapper[4777]: I0216 21:40:00.183202 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:40:00 crc kubenswrapper[4777]: I0216 21:40:00.183283 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:40:00 crc kubenswrapper[4777]: E0216 21:40:00.183436 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:40:00 crc kubenswrapper[4777]: E0216 21:40:00.183796 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:40:00 crc kubenswrapper[4777]: E0216 21:40:00.192791 4777 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Feb 16 21:40:00 crc kubenswrapper[4777]: E0216 21:40:00.301088 4777 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 16 21:40:01 crc kubenswrapper[4777]: I0216 21:40:01.181348 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:40:01 crc kubenswrapper[4777]: E0216 21:40:01.182169 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:40:01 crc kubenswrapper[4777]: I0216 21:40:01.182913 4777 scope.go:117] "RemoveContainer" containerID="2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993"
Feb 16 21:40:01 crc kubenswrapper[4777]: E0216 21:40:01.183195 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w27qk_openshift-ovn-kubernetes(a3c293d7-2d38-4047-a104-7f354aebf216)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216"
Feb 16 21:40:02 crc kubenswrapper[4777]: I0216 21:40:02.181746 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:40:02 crc kubenswrapper[4777]: I0216 21:40:02.182019 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:40:02 crc kubenswrapper[4777]: E0216 21:40:02.182220 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:40:02 crc kubenswrapper[4777]: I0216 21:40:02.182255 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:40:02 crc kubenswrapper[4777]: E0216 21:40:02.182630 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:40:02 crc kubenswrapper[4777]: E0216 21:40:02.182877 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:40:03 crc kubenswrapper[4777]: I0216 21:40:03.181848 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:40:03 crc kubenswrapper[4777]: E0216 21:40:03.182209 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:40:04 crc kubenswrapper[4777]: I0216 21:40:04.181396 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:40:04 crc kubenswrapper[4777]: I0216 21:40:04.181409 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:40:04 crc kubenswrapper[4777]: E0216 21:40:04.181641 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:40:04 crc kubenswrapper[4777]: I0216 21:40:04.181752 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:40:04 crc kubenswrapper[4777]: E0216 21:40:04.181900 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:40:04 crc kubenswrapper[4777]: E0216 21:40:04.181984 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:40:05 crc kubenswrapper[4777]: I0216 21:40:05.181572 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:40:05 crc kubenswrapper[4777]: E0216 21:40:05.181830 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:40:05 crc kubenswrapper[4777]: E0216 21:40:05.303257 4777 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 16 21:40:06 crc kubenswrapper[4777]: I0216 21:40:06.181917 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:40:06 crc kubenswrapper[4777]: I0216 21:40:06.182003 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:40:06 crc kubenswrapper[4777]: I0216 21:40:06.182058 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:40:06 crc kubenswrapper[4777]: E0216 21:40:06.182231 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:40:06 crc kubenswrapper[4777]: E0216 21:40:06.182339 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:40:06 crc kubenswrapper[4777]: E0216 21:40:06.182471 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:40:07 crc kubenswrapper[4777]: I0216 21:40:07.181347 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:40:07 crc kubenswrapper[4777]: E0216 21:40:07.181569 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:40:08 crc kubenswrapper[4777]: I0216 21:40:08.181327 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:40:08 crc kubenswrapper[4777]: I0216 21:40:08.181395 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:40:08 crc kubenswrapper[4777]: E0216 21:40:08.181545 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:40:08 crc kubenswrapper[4777]: I0216 21:40:08.181643 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:40:08 crc kubenswrapper[4777]: E0216 21:40:08.181787 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:40:08 crc kubenswrapper[4777]: E0216 21:40:08.181994 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:40:09 crc kubenswrapper[4777]: I0216 21:40:09.181393 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:40:09 crc kubenswrapper[4777]: E0216 21:40:09.181584 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:40:10 crc kubenswrapper[4777]: I0216 21:40:10.182025 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:40:10 crc kubenswrapper[4777]: I0216 21:40:10.182117 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:40:10 crc kubenswrapper[4777]: I0216 21:40:10.182167 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:40:10 crc kubenswrapper[4777]: E0216 21:40:10.185107 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:40:10 crc kubenswrapper[4777]: E0216 21:40:10.185253 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:40:10 crc kubenswrapper[4777]: E0216 21:40:10.185342 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:40:10 crc kubenswrapper[4777]: E0216 21:40:10.304086 4777 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 16 21:40:11 crc kubenswrapper[4777]: I0216 21:40:11.181808 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:40:11 crc kubenswrapper[4777]: E0216 21:40:11.182039 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:40:11 crc kubenswrapper[4777]: I0216 21:40:11.182531 4777 scope.go:117] "RemoveContainer" containerID="0b7c71b79ec54c7373971886c4f36d1a4e31d558dbecbb5925db4d29961baeb7"
Feb 16 21:40:12 crc kubenswrapper[4777]: I0216 21:40:12.052147 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vpf28_71656da7-4f33-419d-aaba-93bf9158f706/kube-multus/1.log"
Feb 16 21:40:12 crc kubenswrapper[4777]: I0216 21:40:12.052607 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vpf28" event={"ID":"71656da7-4f33-419d-aaba-93bf9158f706","Type":"ContainerStarted","Data":"8e174a786a13a393184f0f1500ffa4bb237ec6327058a5eb9c93ed917f3b3579"}
Feb 16 21:40:12 crc kubenswrapper[4777]: I0216 21:40:12.182176 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:40:12 crc kubenswrapper[4777]: E0216 21:40:12.182380 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:40:12 crc kubenswrapper[4777]: I0216 21:40:12.182688 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:40:12 crc kubenswrapper[4777]: E0216 21:40:12.182832 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:40:12 crc kubenswrapper[4777]: I0216 21:40:12.183119 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:40:12 crc kubenswrapper[4777]: E0216 21:40:12.183245 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:40:13 crc kubenswrapper[4777]: I0216 21:40:13.181511 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:40:13 crc kubenswrapper[4777]: E0216 21:40:13.181763 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:40:14 crc kubenswrapper[4777]: I0216 21:40:14.181673 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:40:14 crc kubenswrapper[4777]: I0216 21:40:14.181831 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:40:14 crc kubenswrapper[4777]: I0216 21:40:14.181673 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:40:14 crc kubenswrapper[4777]: E0216 21:40:14.181936 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:40:14 crc kubenswrapper[4777]: E0216 21:40:14.182057 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:40:14 crc kubenswrapper[4777]: E0216 21:40:14.182192 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 16 21:40:15 crc kubenswrapper[4777]: I0216 21:40:15.181029 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84"
Feb 16 21:40:15 crc kubenswrapper[4777]: E0216 21:40:15.181309 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7"
Feb 16 21:40:15 crc kubenswrapper[4777]: E0216 21:40:15.305522 4777 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 16 21:40:16 crc kubenswrapper[4777]: I0216 21:40:16.181912 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 16 21:40:16 crc kubenswrapper[4777]: E0216 21:40:16.182167 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 16 21:40:16 crc kubenswrapper[4777]: I0216 21:40:16.182544 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 16 21:40:16 crc kubenswrapper[4777]: E0216 21:40:16.182645 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 16 21:40:16 crc kubenswrapper[4777]: I0216 21:40:16.181950 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 16 21:40:16 crc kubenswrapper[4777]: E0216 21:40:16.183027 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:40:16 crc kubenswrapper[4777]: I0216 21:40:16.183457 4777 scope.go:117] "RemoveContainer" containerID="2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993" Feb 16 21:40:17 crc kubenswrapper[4777]: I0216 21:40:17.075620 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/3.log" Feb 16 21:40:17 crc kubenswrapper[4777]: I0216 21:40:17.079354 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerStarted","Data":"9495662231ebaa035e6874d5dcb56c062e94dc2af111acff6f4be7588f548456"} Feb 16 21:40:17 crc kubenswrapper[4777]: I0216 21:40:17.080140 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:40:17 crc kubenswrapper[4777]: I0216 21:40:17.181761 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:40:17 crc kubenswrapper[4777]: E0216 21:40:17.181998 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:40:17 crc kubenswrapper[4777]: I0216 21:40:17.346498 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podStartSLOduration=116.346460245 podStartE2EDuration="1m56.346460245s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:17.125518088 +0000 UTC m=+137.708019230" watchObservedRunningTime="2026-02-16 21:40:17.346460245 +0000 UTC m=+137.928961387" Feb 16 21:40:17 crc kubenswrapper[4777]: I0216 21:40:17.348615 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rwm84"] Feb 16 21:40:18 crc kubenswrapper[4777]: I0216 21:40:18.083909 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:40:18 crc kubenswrapper[4777]: E0216 21:40:18.084583 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:40:18 crc kubenswrapper[4777]: I0216 21:40:18.181739 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:40:18 crc kubenswrapper[4777]: I0216 21:40:18.181841 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:40:18 crc kubenswrapper[4777]: I0216 21:40:18.181890 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:40:18 crc kubenswrapper[4777]: E0216 21:40:18.181891 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:40:18 crc kubenswrapper[4777]: E0216 21:40:18.182119 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:40:18 crc kubenswrapper[4777]: E0216 21:40:18.182504 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:40:20 crc kubenswrapper[4777]: I0216 21:40:20.181504 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:40:20 crc kubenswrapper[4777]: I0216 21:40:20.181577 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:40:20 crc kubenswrapper[4777]: I0216 21:40:20.181493 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:40:20 crc kubenswrapper[4777]: E0216 21:40:20.181838 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 16 21:40:20 crc kubenswrapper[4777]: E0216 21:40:20.181952 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 16 21:40:20 crc kubenswrapper[4777]: E0216 21:40:20.182078 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwm84" podUID="a1d5abcd-6e58-4563-98c6-0adb808ed0a7" Feb 16 21:40:20 crc kubenswrapper[4777]: I0216 21:40:20.182630 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:40:20 crc kubenswrapper[4777]: E0216 21:40:20.183110 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 16 21:40:22 crc kubenswrapper[4777]: I0216 21:40:22.181576 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:40:22 crc kubenswrapper[4777]: I0216 21:40:22.181744 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:40:22 crc kubenswrapper[4777]: I0216 21:40:22.181582 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:40:22 crc kubenswrapper[4777]: I0216 21:40:22.181941 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:40:22 crc kubenswrapper[4777]: I0216 21:40:22.187757 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 16 21:40:22 crc kubenswrapper[4777]: I0216 21:40:22.187840 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 16 21:40:22 crc kubenswrapper[4777]: I0216 21:40:22.188257 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 16 21:40:22 crc kubenswrapper[4777]: I0216 21:40:22.188506 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 16 21:40:22 crc kubenswrapper[4777]: I0216 21:40:22.191694 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 16 21:40:22 crc kubenswrapper[4777]: I0216 21:40:22.192010 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 16 21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.477038 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:28 crc kubenswrapper[4777]: E0216 21:40:28.477260 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:42:30.477222725 +0000 UTC m=+271.059723817 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.477509 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.477576 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.477617 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.477684 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.479810 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.487234 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.490155 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.490294 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 
21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.508514 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 16 21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.522660 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 16 21:40:28 crc kubenswrapper[4777]: I0216 21:40:28.535997 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:40:28 crc kubenswrapper[4777]: W0216 21:40:28.815434 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-47eabf335707ebe71873d3e0bfc92ace75a07ec8f92e22b083c7808db91a85e7 WatchSource:0}: Error finding container 47eabf335707ebe71873d3e0bfc92ace75a07ec8f92e22b083c7808db91a85e7: Status 404 returned error can't find the container with id 47eabf335707ebe71873d3e0bfc92ace75a07ec8f92e22b083c7808db91a85e7 Feb 16 21:40:28 crc kubenswrapper[4777]: W0216 21:40:28.820054 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4447473e7a3cbf8c1426bada6ecfd3aa6d51389763e0b0415101c243b16b7c48 WatchSource:0}: Error finding container 4447473e7a3cbf8c1426bada6ecfd3aa6d51389763e0b0415101c243b16b7c48: Status 404 returned error can't find the container with id 4447473e7a3cbf8c1426bada6ecfd3aa6d51389763e0b0415101c243b16b7c48 Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.128505 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2cd7f6b08330b0391957b423e35b61f6bbadc11e7016083385292d9e373b3209"} Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.128565 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"47eabf335707ebe71873d3e0bfc92ace75a07ec8f92e22b083c7808db91a85e7"} Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.131054 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"77fd346ddebc81b66a88a03b8d27854ce378752ba531a34f45138d1ed04445c3"} Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.131107 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"97915995ae649f54db98a0d8d593d2ca5a33404dd928a3510d5e50df574a119a"} Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.134311 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cd4dbd77ccc8c949dc2738683b25815f45965d24633a6433995cf7aaf1a342d6"} Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.134361 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4447473e7a3cbf8c1426bada6ecfd3aa6d51389763e0b0415101c243b16b7c48"} Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.134521 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.782443 4777 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.841895 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.842793 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.847782 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.848458 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.851372 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.852575 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.852648 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.853293 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.853504 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.853505 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.853700 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.854661 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.854961 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8jfdb"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.855546 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.856817 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fb8q7"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.857435 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.858707 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.859554 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.860211 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.862940 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.863425 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.863631 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.863648 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.863823 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.864006 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.864154 4777 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"console-operator-config" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.864268 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.864302 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.864607 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.865760 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sx74s"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.866543 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.866572 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.886801 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.888833 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.914184 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.914588 4777 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.914847 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.914976 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.914971 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.915626 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.916084 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.916571 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.917323 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.917487 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.918927 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.919538 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.919755 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.919860 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.920134 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.920286 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.920461 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kgdkp"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.921428 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.921697 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.922512 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.925425 4777 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-phc5s"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.926091 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5pdvt"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.926705 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rxnqn"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.926955 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.927059 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-phc5s" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.927154 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.928062 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r5cl6"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.928276 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.929957 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.930187 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.930405 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-l29zb"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.930663 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.932215 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.926813 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.928792 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.932951 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.933277 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.933902 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.929378 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.933323 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.934538 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.934841 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.935367 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.934080 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.934104 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.934138 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.935971 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.936095 4777 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.936189 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.936260 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.936337 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.936411 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.936111 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.936150 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.941842 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-87dv2"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.936558 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.936763 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.936884 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.948943 
4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.949220 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.936914 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.937012 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.937047 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.937072 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.949596 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.937074 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.949661 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.949982 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.950270 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.950291 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.937099 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.958816 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gdkjm"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.937757 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.938006 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.938077 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.967263 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.968488 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.968674 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.969417 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.970611 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.975972 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.977796 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980521 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pnrgl"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980172 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980264 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980351 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980418 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980520 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980554 4777 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980567 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980588 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980619 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980636 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.980751 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.988957 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.989144 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.989243 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.989368 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.989456 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.989606 4777 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.989665 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.989824 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.989877 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.989924 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.989941 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.990023 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.990569 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.992251 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.993483 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.994379 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.994433 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.995474 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.995596 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z"] Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.995893 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.996250 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.996402 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z" Feb 16 21:40:29 crc kubenswrapper[4777]: I0216 21:40:29.996606 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.001442 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.004856 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.005197 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.005651 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.005894 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.006140 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.006546 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.008861 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfdd6"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.009503 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bcpwn"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.009735 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.010567 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.012835 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.013302 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.013572 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.015750 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.015989 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.016096 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.016846 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.017029 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lzg58"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.017605 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021250 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ef6550-85de-4b77-b522-865f342fdd21-config\") pod \"openshift-apiserver-operator-796bbdcf4f-69r9h\" (UID: \"98ef6550-85de-4b77-b522-865f342fdd21\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021286 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021303 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021325 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twvx7\" (UniqueName: \"kubernetes.io/projected/f80ba78f-7361-4793-911d-c2db9761916e-kube-api-access-twvx7\") pod \"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021346 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q2sk\" (UniqueName: \"kubernetes.io/projected/8033eae2-307c-42a9-b2b0-cac401f3add8-kube-api-access-6q2sk\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021465 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f80ba78f-7361-4793-911d-c2db9761916e-serving-cert\") pod \"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021508 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdk6b\" (UniqueName: \"kubernetes.io/projected/027a50f3-fb4e-491e-b410-9d2ed7b4b836-kube-api-access-sdk6b\") pod \"cluster-samples-operator-665b6dd947-n6q9g\" (UID: \"027a50f3-fb4e-491e-b410-9d2ed7b4b836\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021548 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-encryption-config\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021567 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5615afd6-9159-4ccc-b08d-4305b9b792bb-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021619 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021642 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b1dfd22-6fa0-4096-97e4-871e6b65ad30-serving-cert\") pod \"openshift-config-operator-7777fb866f-6wbqg\" (UID: \"8b1dfd22-6fa0-4096-97e4-871e6b65ad30\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021665 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021681 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pt9q\" (UniqueName: \"kubernetes.io/projected/21b9331b-c0a6-4a19-a056-93eb683693df-kube-api-access-9pt9q\") pod \"control-plane-machine-set-operator-78cbb6b69f-7fgrb\" (UID: \"21b9331b-c0a6-4a19-a056-93eb683693df\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb" 
Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021696 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-config\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021750 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98ef6550-85de-4b77-b522-865f342fdd21-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-69r9h\" (UID: \"98ef6550-85de-4b77-b522-865f342fdd21\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021770 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021787 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3153ee9e-fa94-4bb7-8506-16778984382b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021803 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-oauth-config\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021822 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5bjg\" (UniqueName: \"kubernetes.io/projected/ca8070e1-ca1b-4716-9e12-c72f37e21f95-kube-api-access-s5bjg\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021839 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3153ee9e-fa94-4bb7-8506-16778984382b-etcd-client\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021852 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3153ee9e-fa94-4bb7-8506-16778984382b-serving-cert\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021866 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-config\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021881 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrht6\" (UniqueName: \"kubernetes.io/projected/1457b00a-86eb-457e-99a3-bf4bb271c513-kube-api-access-nrht6\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021899 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwgrt\" (UniqueName: \"kubernetes.io/projected/98ef6550-85de-4b77-b522-865f342fdd21-kube-api-access-lwgrt\") pod \"openshift-apiserver-operator-796bbdcf4f-69r9h\" (UID: \"98ef6550-85de-4b77-b522-865f342fdd21\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021917 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c3d4c45-0af9-4160-9c7f-db96f50e2210-auth-proxy-config\") pod \"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021940 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63641d03-7d3e-4b50-a5da-722b2b5b97c2-serving-cert\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021955 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlnv4\" (UniqueName: 
\"kubernetes.io/projected/63641d03-7d3e-4b50-a5da-722b2b5b97c2-kube-api-access-jlnv4\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021974 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2sr5\" (UniqueName: \"kubernetes.io/projected/3153ee9e-fa94-4bb7-8506-16778984382b-kube-api-access-v2sr5\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.021992 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-service-ca-bundle\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022010 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022035 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7qr9\" (UniqueName: \"kubernetes.io/projected/29e22b2d-4f29-4258-8b81-31adeae29a3c-kube-api-access-p7qr9\") pod \"downloads-7954f5f757-phc5s\" (UID: \"29e22b2d-4f29-4258-8b81-31adeae29a3c\") 
" pod="openshift-console/downloads-7954f5f757-phc5s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022053 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-node-pullsecrets\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022069 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-serving-cert\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022086 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-dir\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022102 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8070e1-ca1b-4716-9e12-c72f37e21f95-serving-cert\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022119 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-etcd-serving-ca\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022143 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-service-ca\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022159 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/027a50f3-fb4e-491e-b410-9d2ed7b4b836-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-n6q9g\" (UID: \"027a50f3-fb4e-491e-b410-9d2ed7b4b836\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022174 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqgmg\" (UniqueName: \"kubernetes.io/projected/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-kube-api-access-mqgmg\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022190 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j54hq\" (UniqueName: \"kubernetes.io/projected/8b1dfd22-6fa0-4096-97e4-871e6b65ad30-kube-api-access-j54hq\") pod \"openshift-config-operator-7777fb866f-6wbqg\" (UID: \"8b1dfd22-6fa0-4096-97e4-871e6b65ad30\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022204 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022223 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27r2r\" (UniqueName: \"kubernetes.io/projected/7575698b-ab60-49ed-9d95-744b540314a5-kube-api-access-27r2r\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022243 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022261 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1457b00a-86eb-457e-99a3-bf4bb271c513-default-certificate\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022276 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/3153ee9e-fa94-4bb7-8506-16778984382b-encryption-config\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022292 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8033eae2-307c-42a9-b2b0-cac401f3add8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022310 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022323 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80ba78f-7361-4793-911d-c2db9761916e-config\") pod \"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022343 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5615afd6-9159-4ccc-b08d-4305b9b792bb-config\") pod \"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022361 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5c3d4c45-0af9-4160-9c7f-db96f50e2210-machine-approver-tls\") pod \"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022379 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8033eae2-307c-42a9-b2b0-cac401f3add8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022392 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-audit-dir\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022406 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8033eae2-307c-42a9-b2b0-cac401f3add8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022422 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1457b00a-86eb-457e-99a3-bf4bb271c513-metrics-certs\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022439 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkcrw\" (UniqueName: \"kubernetes.io/projected/5615afd6-9159-4ccc-b08d-4305b9b792bb-kube-api-access-fkcrw\") pod \"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022457 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-client-ca\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022471 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.022491 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-oauth-serving-cert\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " 
pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.029017 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-client-ca\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.029106 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1457b00a-86eb-457e-99a3-bf4bb271c513-stats-auth\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.030732 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3153ee9e-fa94-4bb7-8506-16778984382b-audit-dir\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.030770 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-serving-cert\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.030789 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3d4c45-0af9-4160-9c7f-db96f50e2210-config\") pod 
\"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.030809 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-audit\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.030844 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.030866 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.030890 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3153ee9e-fa94-4bb7-8506-16778984382b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.030930 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc4b8a4b-737b-411b-a707-fd1e086b685e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tcrg\" (UID: \"fc4b8a4b-737b-411b-a707-fd1e086b685e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.030951 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.030970 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw76z\" (UniqueName: \"kubernetes.io/projected/fc7762e5-2274-4c99-8d8c-0fb23340150f-kube-api-access-gw76z\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.030990 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc4b8a4b-737b-411b-a707-fd1e086b685e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tcrg\" (UID: \"fc4b8a4b-737b-411b-a707-fd1e086b685e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.031009 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f80ba78f-7361-4793-911d-c2db9761916e-trusted-ca\") pod \"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.031029 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2cqw\" (UniqueName: \"kubernetes.io/projected/5c3d4c45-0af9-4160-9c7f-db96f50e2210-kube-api-access-r2cqw\") pod \"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.031050 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-image-import-ca\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.031070 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.033226 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-trusted-ca-bundle\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 
crc kubenswrapper[4777]: I0216 21:40:30.033773 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm57m\" (UniqueName: \"kubernetes.io/projected/fc4b8a4b-737b-411b-a707-fd1e086b685e-kube-api-access-jm57m\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tcrg\" (UID: \"fc4b8a4b-737b-411b-a707-fd1e086b685e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.033817 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/21b9331b-c0a6-4a19-a056-93eb683693df-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7fgrb\" (UID: \"21b9331b-c0a6-4a19-a056-93eb683693df\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.033859 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-serving-cert\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.034074 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8b1dfd22-6fa0-4096-97e4-871e6b65ad30-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6wbqg\" (UID: \"8b1dfd22-6fa0-4096-97e4-871e6b65ad30\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.034115 4777 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-config\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.034143 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-console-config\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.034190 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ggw\" (UniqueName: \"kubernetes.io/projected/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-kube-api-access-p6ggw\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.034315 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3153ee9e-fa94-4bb7-8506-16778984382b-audit-policies\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.034527 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-policies\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.034640 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-config\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.034687 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5615afd6-9159-4ccc-b08d-4305b9b792bb-images\") pod \"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.034724 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-etcd-client\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.034762 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1457b00a-86eb-457e-99a3-bf4bb271c513-service-ca-bundle\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.034842 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.042890 
4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.045214 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.045264 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.051007 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.052735 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.053043 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.055183 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.055347 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.056959 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kpqz8"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.060886 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.064580 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.065604 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.066146 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fb8q7"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.066751 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.066841 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.067675 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.072200 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8jfdb"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.073365 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.074793 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r5cl6"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.075580 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kgdkp"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.076830 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sx74s"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.077618 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.078620 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.085619 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.086235 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.094532 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.096569 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.098243 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.099085 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-phc5s"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.100132 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9skvx"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.102024 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.102181 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9skvx" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.102354 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rxnqn"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.103108 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5pdvt"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.104197 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.105386 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bcpwn"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.105409 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.106114 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.108487 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.109201 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-87dv2"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.110798 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pnrgl"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.111320 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt"] Feb 16 21:40:30 crc 
kubenswrapper[4777]: I0216 21:40:30.113860 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.113894 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.113906 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.115249 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.116182 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.117203 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.118040 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.118998 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfdd6"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.120327 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gdkjm"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.121000 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.122156 4777 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lzg58"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.123101 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.124398 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kpqz8"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.124881 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.125045 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kr6qs"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.125918 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kr6qs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.126358 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-j4h22"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.129232 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kr6qs"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.129329 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-j4h22" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.130560 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-j4h22"] Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.136063 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.138896 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc4b8a4b-737b-411b-a707-fd1e086b685e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tcrg\" (UID: \"fc4b8a4b-737b-411b-a707-fd1e086b685e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.138982 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96bw5\" (UniqueName: \"kubernetes.io/projected/263c0239-456c-4076-ae02-eb2a4abc33e4-kube-api-access-96bw5\") pod \"dns-default-lzg58\" (UID: \"263c0239-456c-4076-ae02-eb2a4abc33e4\") " pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139057 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 
21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139089 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139113 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw76z\" (UniqueName: \"kubernetes.io/projected/fc7762e5-2274-4c99-8d8c-0fb23340150f-kube-api-access-gw76z\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139143 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2cqw\" (UniqueName: \"kubernetes.io/projected/5c3d4c45-0af9-4160-9c7f-db96f50e2210-kube-api-access-r2cqw\") pod \"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139198 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-trusted-ca-bundle\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139233 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm57m\" (UniqueName: \"kubernetes.io/projected/fc4b8a4b-737b-411b-a707-fd1e086b685e-kube-api-access-jm57m\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-6tcrg\" (UID: \"fc4b8a4b-737b-411b-a707-fd1e086b685e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139263 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/21b9331b-c0a6-4a19-a056-93eb683693df-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7fgrb\" (UID: \"21b9331b-c0a6-4a19-a056-93eb683693df\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139298 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8b1dfd22-6fa0-4096-97e4-871e6b65ad30-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6wbqg\" (UID: \"8b1dfd22-6fa0-4096-97e4-871e6b65ad30\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139332 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-console-config\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139364 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b303b57a-e6e8-40da-8fa9-56e11e1a948b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-w996l\" (UID: \"b303b57a-e6e8-40da-8fa9-56e11e1a948b\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139401 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3153ee9e-fa94-4bb7-8506-16778984382b-audit-policies\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139435 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twvx7\" (UniqueName: \"kubernetes.io/projected/f80ba78f-7361-4793-911d-c2db9761916e-kube-api-access-twvx7\") pod \"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139471 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4af6ea3c-7da8-465d-b056-9b9f0a67f626-images\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139499 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4af6ea3c-7da8-465d-b056-9b9f0a67f626-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139528 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6kc6m\" (UID: \"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139735 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f80ba78f-7361-4793-911d-c2db9761916e-serving-cert\") pod \"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139782 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdk6b\" (UniqueName: \"kubernetes.io/projected/027a50f3-fb4e-491e-b410-9d2ed7b4b836-kube-api-access-sdk6b\") pod \"cluster-samples-operator-665b6dd947-n6q9g\" (UID: \"027a50f3-fb4e-491e-b410-9d2ed7b4b836\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139820 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-encryption-config\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139857 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139886 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b1dfd22-6fa0-4096-97e4-871e6b65ad30-serving-cert\") pod \"openshift-config-operator-7777fb866f-6wbqg\" (UID: \"8b1dfd22-6fa0-4096-97e4-871e6b65ad30\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139930 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efb9114f-122a-4b76-8763-4dd9dd908707-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c9g4b\" (UID: \"efb9114f-122a-4b76-8763-4dd9dd908707\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.139975 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pt9q\" (UniqueName: \"kubernetes.io/projected/21b9331b-c0a6-4a19-a056-93eb683693df-kube-api-access-9pt9q\") pod \"control-plane-machine-set-operator-78cbb6b69f-7fgrb\" (UID: \"21b9331b-c0a6-4a19-a056-93eb683693df\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.140041 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3153ee9e-fa94-4bb7-8506-16778984382b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.140085 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-oauth-config\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.140130 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5bjg\" (UniqueName: \"kubernetes.io/projected/ca8070e1-ca1b-4716-9e12-c72f37e21f95-kube-api-access-s5bjg\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.140217 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrht6\" (UniqueName: \"kubernetes.io/projected/1457b00a-86eb-457e-99a3-bf4bb271c513-kube-api-access-nrht6\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.140266 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2sr5\" (UniqueName: \"kubernetes.io/projected/3153ee9e-fa94-4bb7-8506-16778984382b-kube-api-access-v2sr5\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.140283 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8b1dfd22-6fa0-4096-97e4-871e6b65ad30-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6wbqg\" (UID: \"8b1dfd22-6fa0-4096-97e4-871e6b65ad30\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.140311 
4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-dir\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.140872 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc4b8a4b-737b-411b-a707-fd1e086b685e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tcrg\" (UID: \"fc4b8a4b-737b-411b-a707-fd1e086b685e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.142826 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-console-config\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.143669 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3153ee9e-fa94-4bb7-8506-16778984382b-audit-policies\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.144235 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3153ee9e-fa94-4bb7-8506-16778984382b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 
21:40:30.144295 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.145024 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-encryption-config\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.145299 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.145757 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.146426 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f80ba78f-7361-4793-911d-c2db9761916e-serving-cert\") pod \"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " 
pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.146579 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-dir\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.147844 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.147951 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-node-pullsecrets\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148096 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8070e1-ca1b-4716-9e12-c72f37e21f95-serving-cert\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148140 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b303b57a-e6e8-40da-8fa9-56e11e1a948b-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-w996l\" (UID: \"b303b57a-e6e8-40da-8fa9-56e11e1a948b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148184 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/027a50f3-fb4e-491e-b410-9d2ed7b4b836-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-n6q9g\" (UID: \"027a50f3-fb4e-491e-b410-9d2ed7b4b836\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148219 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqgmg\" (UniqueName: \"kubernetes.io/projected/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-kube-api-access-mqgmg\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148305 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j54hq\" (UniqueName: \"kubernetes.io/projected/8b1dfd22-6fa0-4096-97e4-871e6b65ad30-kube-api-access-j54hq\") pod \"openshift-config-operator-7777fb866f-6wbqg\" (UID: \"8b1dfd22-6fa0-4096-97e4-871e6b65ad30\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148341 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 
21:40:30.148370 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27r2r\" (UniqueName: \"kubernetes.io/projected/7575698b-ab60-49ed-9d95-744b540314a5-kube-api-access-27r2r\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148402 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3153ee9e-fa94-4bb7-8506-16778984382b-encryption-config\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148426 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8033eae2-307c-42a9-b2b0-cac401f3add8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148427 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/21b9331b-c0a6-4a19-a056-93eb683693df-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7fgrb\" (UID: \"21b9331b-c0a6-4a19-a056-93eb683693df\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148455 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1457b00a-86eb-457e-99a3-bf4bb271c513-default-certificate\") pod 
\"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148486 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80ba78f-7361-4793-911d-c2db9761916e-config\") pod \"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148817 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.148860 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-node-pullsecrets\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149019 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5c3d4c45-0af9-4160-9c7f-db96f50e2210-machine-approver-tls\") pod \"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149066 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkcrw\" (UniqueName: \"kubernetes.io/projected/5615afd6-9159-4ccc-b08d-4305b9b792bb-kube-api-access-fkcrw\") pod \"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 
21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149099 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-oauth-serving-cert\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149118 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-client-ca\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149152 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1457b00a-86eb-457e-99a3-bf4bb271c513-stats-auth\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149177 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149200 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-serving-cert\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " 
pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149216 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3d4c45-0af9-4160-9c7f-db96f50e2210-config\") pod \"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149241 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qtcj\" (UniqueName: \"kubernetes.io/projected/efb9114f-122a-4b76-8763-4dd9dd908707-kube-api-access-6qtcj\") pod \"machine-config-controller-84d6567774-c9g4b\" (UID: \"efb9114f-122a-4b76-8763-4dd9dd908707\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149264 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3153ee9e-fa94-4bb7-8506-16778984382b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149285 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xksqm\" (UniqueName: \"kubernetes.io/projected/489bab1c-472b-4904-b4bb-3dc341f48f27-kube-api-access-xksqm\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149306 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fc4b8a4b-737b-411b-a707-fd1e086b685e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tcrg\" (UID: \"fc4b8a4b-737b-411b-a707-fd1e086b685e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149324 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f80ba78f-7361-4793-911d-c2db9761916e-trusted-ca\") pod \"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149344 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-image-import-ca\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149379 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da00f555-fb9c-4329-b263-5d1e3c31180e-config\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149405 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-serving-cert\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149426 4777 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wwv9\" (UniqueName: \"kubernetes.io/projected/90968a16-faa8-4ec7-a565-4d3cb75ed1dd-kube-api-access-9wwv9\") pod \"dns-operator-744455d44c-pnrgl\" (UID: \"90968a16-faa8-4ec7-a565-4d3cb75ed1dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149457 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da00f555-fb9c-4329-b263-5d1e3c31180e-etcd-client\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149483 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-config\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149511 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ggw\" (UniqueName: \"kubernetes.io/projected/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-kube-api-access-p6ggw\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149533 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-policies\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149559 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-config\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149581 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5615afd6-9159-4ccc-b08d-4305b9b792bb-images\") pod \"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149601 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-etcd-client\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149621 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1457b00a-86eb-457e-99a3-bf4bb271c513-service-ca-bundle\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149641 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ef6550-85de-4b77-b522-865f342fdd21-config\") pod \"openshift-apiserver-operator-796bbdcf4f-69r9h\" (UID: 
\"98ef6550-85de-4b77-b522-865f342fdd21\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149661 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149682 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149704 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sdzx\" (UniqueName: \"kubernetes.io/projected/da00f555-fb9c-4329-b263-5d1e3c31180e-kube-api-access-5sdzx\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149739 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q2sk\" (UniqueName: \"kubernetes.io/projected/8033eae2-307c-42a9-b2b0-cac401f3add8-kube-api-access-6q2sk\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149761 4777 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6kc6m\" (UID: \"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149782 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wck4\" (UniqueName: \"kubernetes.io/projected/4af6ea3c-7da8-465d-b056-9b9f0a67f626-kube-api-access-8wck4\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149801 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/489bab1c-472b-4904-b4bb-3dc341f48f27-metrics-tls\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149819 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af6ea3c-7da8-465d-b056-9b9f0a67f626-proxy-tls\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149853 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/5615afd6-9159-4ccc-b08d-4305b9b792bb-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149872 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da00f555-fb9c-4329-b263-5d1e3c31180e-serving-cert\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149888 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/263c0239-456c-4076-ae02-eb2a4abc33e4-config-volume\") pod \"dns-default-lzg58\" (UID: \"263c0239-456c-4076-ae02-eb2a4abc33e4\") " pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149912 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149948 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-config\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149969 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98ef6550-85de-4b77-b522-865f342fdd21-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-69r9h\" (UID: \"98ef6550-85de-4b77-b522-865f342fdd21\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.149994 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150018 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6kc6m\" (UID: \"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150049 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwgrt\" (UniqueName: \"kubernetes.io/projected/98ef6550-85de-4b77-b522-865f342fdd21-kube-api-access-lwgrt\") pod \"openshift-apiserver-operator-796bbdcf4f-69r9h\" (UID: \"98ef6550-85de-4b77-b522-865f342fdd21\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150081 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3153ee9e-fa94-4bb7-8506-16778984382b-etcd-client\") pod 
\"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150107 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3153ee9e-fa94-4bb7-8506-16778984382b-serving-cert\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150130 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-config\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150162 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/489bab1c-472b-4904-b4bb-3dc341f48f27-trusted-ca\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150198 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63641d03-7d3e-4b50-a5da-722b2b5b97c2-serving-cert\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150227 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlnv4\" (UniqueName: 
\"kubernetes.io/projected/63641d03-7d3e-4b50-a5da-722b2b5b97c2-kube-api-access-jlnv4\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150256 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c3d4c45-0af9-4160-9c7f-db96f50e2210-auth-proxy-config\") pod \"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150284 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-service-ca-bundle\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150311 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b303b57a-e6e8-40da-8fa9-56e11e1a948b-config\") pod \"kube-controller-manager-operator-78b949d7b-w996l\" (UID: \"b303b57a-e6e8-40da-8fa9-56e11e1a948b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150332 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/efb9114f-122a-4b76-8763-4dd9dd908707-proxy-tls\") pod \"machine-config-controller-84d6567774-c9g4b\" (UID: \"efb9114f-122a-4b76-8763-4dd9dd908707\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150353 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150373 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7qr9\" (UniqueName: \"kubernetes.io/projected/29e22b2d-4f29-4258-8b81-31adeae29a3c-kube-api-access-p7qr9\") pod \"downloads-7954f5f757-phc5s\" (UID: \"29e22b2d-4f29-4258-8b81-31adeae29a3c\") " pod="openshift-console/downloads-7954f5f757-phc5s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150394 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-serving-cert\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150421 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-etcd-serving-ca\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150447 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/da00f555-fb9c-4329-b263-5d1e3c31180e-etcd-service-ca\") pod 
\"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150468 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-service-ca\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150489 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90968a16-faa8-4ec7-a565-4d3cb75ed1dd-metrics-tls\") pod \"dns-operator-744455d44c-pnrgl\" (UID: \"90968a16-faa8-4ec7-a565-4d3cb75ed1dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150517 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150540 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150562 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5615afd6-9159-4ccc-b08d-4305b9b792bb-config\") pod \"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150581 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8033eae2-307c-42a9-b2b0-cac401f3add8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150600 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-audit-dir\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150619 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8033eae2-307c-42a9-b2b0-cac401f3add8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150640 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1457b00a-86eb-457e-99a3-bf4bb271c513-metrics-certs\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150658 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/da00f555-fb9c-4329-b263-5d1e3c31180e-etcd-ca\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150680 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-client-ca\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150699 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150732 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/489bab1c-472b-4904-b4bb-3dc341f48f27-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150755 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/263c0239-456c-4076-ae02-eb2a4abc33e4-metrics-tls\") pod \"dns-default-lzg58\" (UID: \"263c0239-456c-4076-ae02-eb2a4abc33e4\") " 
pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150764 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-oauth-config\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150786 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3153ee9e-fa94-4bb7-8506-16778984382b-audit-dir\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.150810 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-audit\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.152314 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-oauth-serving-cert\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.153239 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-client-ca\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 
crc kubenswrapper[4777]: I0216 21:40:30.154086 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80ba78f-7361-4793-911d-c2db9761916e-config\") pod \"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.154196 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b1dfd22-6fa0-4096-97e4-871e6b65ad30-serving-cert\") pod \"openshift-config-operator-7777fb866f-6wbqg\" (UID: \"8b1dfd22-6fa0-4096-97e4-871e6b65ad30\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.154285 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.154422 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-audit\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.154468 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 
21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.143203 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-trusted-ca-bundle\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.154964 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3153ee9e-fa94-4bb7-8506-16778984382b-encryption-config\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.155334 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-config\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.155462 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5c3d4c45-0af9-4160-9c7f-db96f50e2210-machine-approver-tls\") pod \"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.155477 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8033eae2-307c-42a9-b2b0-cac401f3add8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.155764 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/027a50f3-fb4e-491e-b410-9d2ed7b4b836-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-n6q9g\" (UID: \"027a50f3-fb4e-491e-b410-9d2ed7b4b836\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.156129 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8070e1-ca1b-4716-9e12-c72f37e21f95-serving-cert\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.156179 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3d4c45-0af9-4160-9c7f-db96f50e2210-config\") pod \"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.156631 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5615afd6-9159-4ccc-b08d-4305b9b792bb-images\") pod \"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.156806 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3153ee9e-fa94-4bb7-8506-16778984382b-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.157074 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1457b00a-86eb-457e-99a3-bf4bb271c513-default-certificate\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.158298 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1457b00a-86eb-457e-99a3-bf4bb271c513-service-ca-bundle\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.158738 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-serving-cert\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.159322 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-image-import-ca\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.159482 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f80ba78f-7361-4793-911d-c2db9761916e-trusted-ca\") pod 
\"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.159733 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1457b00a-86eb-457e-99a3-bf4bb271c513-stats-auth\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.160387 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5615afd6-9159-4ccc-b08d-4305b9b792bb-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.160548 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc4b8a4b-737b-411b-a707-fd1e086b685e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tcrg\" (UID: \"fc4b8a4b-737b-411b-a707-fd1e086b685e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.161073 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-service-ca-bundle\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.161549 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/98ef6550-85de-4b77-b522-865f342fdd21-config\") pod \"openshift-apiserver-operator-796bbdcf4f-69r9h\" (UID: \"98ef6550-85de-4b77-b522-865f342fdd21\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.161638 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.161560 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-etcd-client\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.161728 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-audit-dir\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.161881 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.161556 4777 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98ef6550-85de-4b77-b522-865f342fdd21-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-69r9h\" (UID: \"98ef6550-85de-4b77-b522-865f342fdd21\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.162023 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-config\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.162353 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-policies\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.162865 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-etcd-serving-ca\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.162805 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5c3d4c45-0af9-4160-9c7f-db96f50e2210-auth-proxy-config\") pod \"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 
21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.163139 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.163554 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-service-ca\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.163959 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5615afd6-9159-4ccc-b08d-4305b9b792bb-config\") pod \"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.164029 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-client-ca\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.164263 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-config\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 
21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.164469 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.165428 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-config\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.165463 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-serving-cert\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.165492 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.165815 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63641d03-7d3e-4b50-a5da-722b2b5b97c2-serving-cert\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: 
\"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.164097 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3153ee9e-fa94-4bb7-8506-16778984382b-audit-dir\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.168214 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-serving-cert\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.168461 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3153ee9e-fa94-4bb7-8506-16778984382b-serving-cert\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.168569 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.168707 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.169016 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3153ee9e-fa94-4bb7-8506-16778984382b-etcd-client\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.169821 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.172912 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1457b00a-86eb-457e-99a3-bf4bb271c513-metrics-certs\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.175471 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.185504 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8033eae2-307c-42a9-b2b0-cac401f3add8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 
21:40:30.205936 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.225203 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.244595 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251452 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qtcj\" (UniqueName: \"kubernetes.io/projected/efb9114f-122a-4b76-8763-4dd9dd908707-kube-api-access-6qtcj\") pod \"machine-config-controller-84d6567774-c9g4b\" (UID: \"efb9114f-122a-4b76-8763-4dd9dd908707\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251483 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xksqm\" (UniqueName: \"kubernetes.io/projected/489bab1c-472b-4904-b4bb-3dc341f48f27-kube-api-access-xksqm\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251504 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da00f555-fb9c-4329-b263-5d1e3c31180e-config\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251520 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9wwv9\" (UniqueName: \"kubernetes.io/projected/90968a16-faa8-4ec7-a565-4d3cb75ed1dd-kube-api-access-9wwv9\") pod \"dns-operator-744455d44c-pnrgl\" (UID: \"90968a16-faa8-4ec7-a565-4d3cb75ed1dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251534 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da00f555-fb9c-4329-b263-5d1e3c31180e-etcd-client\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251556 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sdzx\" (UniqueName: \"kubernetes.io/projected/da00f555-fb9c-4329-b263-5d1e3c31180e-kube-api-access-5sdzx\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251584 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6kc6m\" (UID: \"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251601 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/489bab1c-472b-4904-b4bb-3dc341f48f27-metrics-tls\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:30 crc 
kubenswrapper[4777]: I0216 21:40:30.251616 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wck4\" (UniqueName: \"kubernetes.io/projected/4af6ea3c-7da8-465d-b056-9b9f0a67f626-kube-api-access-8wck4\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251632 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af6ea3c-7da8-465d-b056-9b9f0a67f626-proxy-tls\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251654 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da00f555-fb9c-4329-b263-5d1e3c31180e-serving-cert\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251669 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/263c0239-456c-4076-ae02-eb2a4abc33e4-config-volume\") pod \"dns-default-lzg58\" (UID: \"263c0239-456c-4076-ae02-eb2a4abc33e4\") " pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251686 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6kc6m\" (UID: \"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251728 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/489bab1c-472b-4904-b4bb-3dc341f48f27-trusted-ca\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251748 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b303b57a-e6e8-40da-8fa9-56e11e1a948b-config\") pod \"kube-controller-manager-operator-78b949d7b-w996l\" (UID: \"b303b57a-e6e8-40da-8fa9-56e11e1a948b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251767 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/efb9114f-122a-4b76-8763-4dd9dd908707-proxy-tls\") pod \"machine-config-controller-84d6567774-c9g4b\" (UID: \"efb9114f-122a-4b76-8763-4dd9dd908707\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251788 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/da00f555-fb9c-4329-b263-5d1e3c31180e-etcd-service-ca\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251803 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/90968a16-faa8-4ec7-a565-4d3cb75ed1dd-metrics-tls\") pod \"dns-operator-744455d44c-pnrgl\" (UID: \"90968a16-faa8-4ec7-a565-4d3cb75ed1dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251824 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/da00f555-fb9c-4329-b263-5d1e3c31180e-etcd-ca\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251843 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/489bab1c-472b-4904-b4bb-3dc341f48f27-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251859 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/263c0239-456c-4076-ae02-eb2a4abc33e4-metrics-tls\") pod \"dns-default-lzg58\" (UID: \"263c0239-456c-4076-ae02-eb2a4abc33e4\") " pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251878 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96bw5\" (UniqueName: \"kubernetes.io/projected/263c0239-456c-4076-ae02-eb2a4abc33e4-kube-api-access-96bw5\") pod \"dns-default-lzg58\" (UID: \"263c0239-456c-4076-ae02-eb2a4abc33e4\") " pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251911 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b303b57a-e6e8-40da-8fa9-56e11e1a948b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-w996l\" (UID: \"b303b57a-e6e8-40da-8fa9-56e11e1a948b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251941 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4af6ea3c-7da8-465d-b056-9b9f0a67f626-images\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251956 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4af6ea3c-7da8-465d-b056-9b9f0a67f626-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251972 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6kc6m\" (UID: \"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.251997 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efb9114f-122a-4b76-8763-4dd9dd908707-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c9g4b\" (UID: \"efb9114f-122a-4b76-8763-4dd9dd908707\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.252071 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b303b57a-e6e8-40da-8fa9-56e11e1a948b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-w996l\" (UID: \"b303b57a-e6e8-40da-8fa9-56e11e1a948b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.253000 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b303b57a-e6e8-40da-8fa9-56e11e1a948b-config\") pod \"kube-controller-manager-operator-78b949d7b-w996l\" (UID: \"b303b57a-e6e8-40da-8fa9-56e11e1a948b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.253129 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4af6ea3c-7da8-465d-b056-9b9f0a67f626-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.253689 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efb9114f-122a-4b76-8763-4dd9dd908707-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c9g4b\" (UID: \"efb9114f-122a-4b76-8763-4dd9dd908707\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.264453 4777 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.276028 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b303b57a-e6e8-40da-8fa9-56e11e1a948b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-w996l\" (UID: \"b303b57a-e6e8-40da-8fa9-56e11e1a948b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.285611 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.312529 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.324169 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/489bab1c-472b-4904-b4bb-3dc341f48f27-trusted-ca\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.326180 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.345200 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.358005 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da00f555-fb9c-4329-b263-5d1e3c31180e-serving-cert\") pod \"etcd-operator-b45778765-87dv2\" (UID: 
\"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.365872 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.377324 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da00f555-fb9c-4329-b263-5d1e3c31180e-etcd-client\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.384426 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.393525 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da00f555-fb9c-4329-b263-5d1e3c31180e-config\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.406202 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.414050 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/da00f555-fb9c-4329-b263-5d1e3c31180e-etcd-ca\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.424847 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 16 21:40:30 crc 
kubenswrapper[4777]: I0216 21:40:30.433557 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/da00f555-fb9c-4329-b263-5d1e3c31180e-etcd-service-ca\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.445095 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.465670 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.484890 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.506188 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.518518 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/489bab1c-472b-4904-b4bb-3dc341f48f27-metrics-tls\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.525210 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.546347 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 16 21:40:30 crc 
kubenswrapper[4777]: I0216 21:40:30.565013 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.586163 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.606478 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.625896 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.646805 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.665151 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.687774 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.705835 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.726524 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.745585 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.765667 4777 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.784940 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.798553 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90968a16-faa8-4ec7-a565-4d3cb75ed1dd-metrics-tls\") pod \"dns-operator-744455d44c-pnrgl\" (UID: \"90968a16-faa8-4ec7-a565-4d3cb75ed1dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.812899 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.825760 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.845460 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.865091 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.886504 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.905014 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.925885 4777 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.934412 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4af6ea3c-7da8-465d-b056-9b9f0a67f626-images\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.946002 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.966151 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.984913 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 16 21:40:30 crc kubenswrapper[4777]: I0216 21:40:30.997542 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af6ea3c-7da8-465d-b056-9b9f0a67f626-proxy-tls\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.005317 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.024027 4777 request.go:700] Waited for 1.017868492s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcatalog-operator-serving-cert&limit=500&resourceVersion=0 Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.039685 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.053550 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.071215 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.085972 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.104828 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.105225 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/efb9114f-122a-4b76-8763-4dd9dd908707-proxy-tls\") pod \"machine-config-controller-84d6567774-c9g4b\" (UID: \"efb9114f-122a-4b76-8763-4dd9dd908707\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.125077 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.153751 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 
21:40:31.164985 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.186925 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.205837 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.224768 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.245211 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 16 21:40:31 crc kubenswrapper[4777]: E0216 21:40:31.253022 4777 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition Feb 16 21:40:31 crc kubenswrapper[4777]: E0216 21:40:31.253089 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-config podName:6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a nodeName:}" failed. No retries permitted until 2026-02-16 21:40:31.75306786 +0000 UTC m=+152.335568962 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" (UID: "6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a") : failed to sync configmap cache: timed out waiting for the condition Feb 16 21:40:31 crc kubenswrapper[4777]: E0216 21:40:31.253147 4777 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Feb 16 21:40:31 crc kubenswrapper[4777]: E0216 21:40:31.253132 4777 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 16 21:40:31 crc kubenswrapper[4777]: E0216 21:40:31.253214 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/263c0239-456c-4076-ae02-eb2a4abc33e4-config-volume podName:263c0239-456c-4076-ae02-eb2a4abc33e4 nodeName:}" failed. No retries permitted until 2026-02-16 21:40:31.753192834 +0000 UTC m=+152.335693966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/263c0239-456c-4076-ae02-eb2a4abc33e4-config-volume") pod "dns-default-lzg58" (UID: "263c0239-456c-4076-ae02-eb2a4abc33e4") : failed to sync configmap cache: timed out waiting for the condition Feb 16 21:40:31 crc kubenswrapper[4777]: E0216 21:40:31.253262 4777 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Feb 16 21:40:31 crc kubenswrapper[4777]: E0216 21:40:31.253327 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-serving-cert podName:6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a nodeName:}" failed. 
No retries permitted until 2026-02-16 21:40:31.753278347 +0000 UTC m=+152.335779509 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" (UID: "6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a") : failed to sync secret cache: timed out waiting for the condition Feb 16 21:40:31 crc kubenswrapper[4777]: E0216 21:40:31.253380 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/263c0239-456c-4076-ae02-eb2a4abc33e4-metrics-tls podName:263c0239-456c-4076-ae02-eb2a4abc33e4 nodeName:}" failed. No retries permitted until 2026-02-16 21:40:31.75335799 +0000 UTC m=+152.335859232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/263c0239-456c-4076-ae02-eb2a4abc33e4-metrics-tls") pod "dns-default-lzg58" (UID: "263c0239-456c-4076-ae02-eb2a4abc33e4") : failed to sync secret cache: timed out waiting for the condition Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.264918 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.285805 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.305380 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.325077 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.346288 4777 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.365121 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.386706 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.406324 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.426363 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.445943 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.485994 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.506030 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.526558 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.547206 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.566444 4777 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.585901 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.606295 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.626175 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.646645 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.666894 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.686454 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.706523 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.726276 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.746547 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.766470 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 16 21:40:31 crc 
kubenswrapper[4777]: I0216 21:40:31.777033 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/263c0239-456c-4076-ae02-eb2a4abc33e4-config-volume\") pod \"dns-default-lzg58\" (UID: \"263c0239-456c-4076-ae02-eb2a4abc33e4\") " pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.777102 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6kc6m\" (UID: \"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.777221 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/263c0239-456c-4076-ae02-eb2a4abc33e4-metrics-tls\") pod \"dns-default-lzg58\" (UID: \"263c0239-456c-4076-ae02-eb2a4abc33e4\") " pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.777398 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6kc6m\" (UID: \"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.778540 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6kc6m\" (UID: \"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.778764 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/263c0239-456c-4076-ae02-eb2a4abc33e4-config-volume\") pod \"dns-default-lzg58\" (UID: \"263c0239-456c-4076-ae02-eb2a4abc33e4\") " pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.782178 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/263c0239-456c-4076-ae02-eb2a4abc33e4-metrics-tls\") pod \"dns-default-lzg58\" (UID: \"263c0239-456c-4076-ae02-eb2a4abc33e4\") " pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.785211 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.791675 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6kc6m\" (UID: \"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.806767 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.826201 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.846876 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 16 21:40:31 crc 
kubenswrapper[4777]: I0216 21:40:31.865847 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.886432 4777 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.905689 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.953821 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw76z\" (UniqueName: \"kubernetes.io/projected/fc7762e5-2274-4c99-8d8c-0fb23340150f-kube-api-access-gw76z\") pod \"oauth-openshift-558db77b4-5pdvt\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:31 crc kubenswrapper[4777]: I0216 21:40:31.969970 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2cqw\" (UniqueName: \"kubernetes.io/projected/5c3d4c45-0af9-4160-9c7f-db96f50e2210-kube-api-access-r2cqw\") pod \"machine-approver-56656f9798-4kb9f\" (UID: \"5c3d4c45-0af9-4160-9c7f-db96f50e2210\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.001389 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm57m\" (UniqueName: \"kubernetes.io/projected/fc4b8a4b-737b-411b-a707-fd1e086b685e-kube-api-access-jm57m\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tcrg\" (UID: \"fc4b8a4b-737b-411b-a707-fd1e086b685e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.019875 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-twvx7\" (UniqueName: \"kubernetes.io/projected/f80ba78f-7361-4793-911d-c2db9761916e-kube-api-access-twvx7\") pod \"console-operator-58897d9998-8jfdb\" (UID: \"f80ba78f-7361-4793-911d-c2db9761916e\") " pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.024181 4777 request.go:700] Waited for 1.879281217s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.031058 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdk6b\" (UniqueName: \"kubernetes.io/projected/027a50f3-fb4e-491e-b410-9d2ed7b4b836-kube-api-access-sdk6b\") pod \"cluster-samples-operator-665b6dd947-n6q9g\" (UID: \"027a50f3-fb4e-491e-b410-9d2ed7b4b836\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.051632 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.053639 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5bjg\" (UniqueName: \"kubernetes.io/projected/ca8070e1-ca1b-4716-9e12-c72f37e21f95-kube-api-access-s5bjg\") pod \"controller-manager-879f6c89f-sx74s\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.073041 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrht6\" (UniqueName: \"kubernetes.io/projected/1457b00a-86eb-457e-99a3-bf4bb271c513-kube-api-access-nrht6\") pod \"router-default-5444994796-l29zb\" (UID: \"1457b00a-86eb-457e-99a3-bf4bb271c513\") " pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:32 crc kubenswrapper[4777]: W0216 21:40:32.079932 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c3d4c45_0af9_4160_9c7f_db96f50e2210.slice/crio-418fbdb17872db490e5231ff3844fb5c86626ee5980f9feecbdb99111bac5499 WatchSource:0}: Error finding container 418fbdb17872db490e5231ff3844fb5c86626ee5980f9feecbdb99111bac5499: Status 404 returned error can't find the container with id 418fbdb17872db490e5231ff3844fb5c86626ee5980f9feecbdb99111bac5499 Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.090731 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2sr5\" (UniqueName: \"kubernetes.io/projected/3153ee9e-fa94-4bb7-8506-16778984382b-kube-api-access-v2sr5\") pod \"apiserver-7bbb656c7d-9wjm2\" (UID: \"3153ee9e-fa94-4bb7-8506-16778984382b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.100412 4777 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-27r2r\" (UniqueName: \"kubernetes.io/projected/7575698b-ab60-49ed-9d95-744b540314a5-kube-api-access-27r2r\") pod \"console-f9d7485db-rxnqn\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") " pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.111323 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.134948 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqgmg\" (UniqueName: \"kubernetes.io/projected/c1ddbbc5-95a5-4f4f-8a1d-0442229c0928-kube-api-access-mqgmg\") pod \"authentication-operator-69f744f599-fb8q7\" (UID: \"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.136339 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.148364 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54hq\" (UniqueName: \"kubernetes.io/projected/8b1dfd22-6fa0-4096-97e4-871e6b65ad30-kube-api-access-j54hq\") pod \"openshift-config-operator-7777fb866f-6wbqg\" (UID: \"8b1dfd22-6fa0-4096-97e4-871e6b65ad30\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.148787 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.162317 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.165990 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.175946 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.189361 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7qr9\" (UniqueName: \"kubernetes.io/projected/29e22b2d-4f29-4258-8b81-31adeae29a3c-kube-api-access-p7qr9\") pod \"downloads-7954f5f757-phc5s\" (UID: \"29e22b2d-4f29-4258-8b81-31adeae29a3c\") " pod="openshift-console/downloads-7954f5f757-phc5s" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.191032 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkcrw\" (UniqueName: \"kubernetes.io/projected/5615afd6-9159-4ccc-b08d-4305b9b792bb-kube-api-access-fkcrw\") pod \"machine-api-operator-5694c8668f-r5cl6\" (UID: \"5615afd6-9159-4ccc-b08d-4305b9b792bb\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.199869 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.209843 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" event={"ID":"5c3d4c45-0af9-4160-9c7f-db96f50e2210","Type":"ContainerStarted","Data":"418fbdb17872db490e5231ff3844fb5c86626ee5980f9feecbdb99111bac5499"} Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.213209 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q2sk\" (UniqueName: \"kubernetes.io/projected/8033eae2-307c-42a9-b2b0-cac401f3add8-kube-api-access-6q2sk\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.219819 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.237223 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.241317 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pt9q\" (UniqueName: \"kubernetes.io/projected/21b9331b-c0a6-4a19-a056-93eb683693df-kube-api-access-9pt9q\") pod \"control-plane-machine-set-operator-78cbb6b69f-7fgrb\" (UID: \"21b9331b-c0a6-4a19-a056-93eb683693df\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.247133 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwgrt\" (UniqueName: \"kubernetes.io/projected/98ef6550-85de-4b77-b522-865f342fdd21-kube-api-access-lwgrt\") pod \"openshift-apiserver-operator-796bbdcf4f-69r9h\" (UID: \"98ef6550-85de-4b77-b522-865f342fdd21\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.247410 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.264901 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlnv4\" (UniqueName: \"kubernetes.io/projected/63641d03-7d3e-4b50-a5da-722b2b5b97c2-kube-api-access-jlnv4\") pod \"route-controller-manager-6576b87f9c-7kqgs\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.284334 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8033eae2-307c-42a9-b2b0-cac401f3add8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tgrqf\" (UID: \"8033eae2-307c-42a9-b2b0-cac401f3add8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.298688 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.309882 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ggw\" (UniqueName: \"kubernetes.io/projected/365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7-kube-api-access-p6ggw\") pod \"apiserver-76f77b778f-kgdkp\" (UID: \"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7\") " pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.320025 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.355579 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96bw5\" (UniqueName: \"kubernetes.io/projected/263c0239-456c-4076-ae02-eb2a4abc33e4-kube-api-access-96bw5\") pod \"dns-default-lzg58\" (UID: \"263c0239-456c-4076-ae02-eb2a4abc33e4\") " pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.368523 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qtcj\" (UniqueName: \"kubernetes.io/projected/efb9114f-122a-4b76-8763-4dd9dd908707-kube-api-access-6qtcj\") pod \"machine-config-controller-84d6567774-c9g4b\" (UID: \"efb9114f-122a-4b76-8763-4dd9dd908707\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.377211 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.381087 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.396169 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b303b57a-e6e8-40da-8fa9-56e11e1a948b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-w996l\" (UID: \"b303b57a-e6e8-40da-8fa9-56e11e1a948b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.404626 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8jfdb"] Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.408203 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xksqm\" (UniqueName: \"kubernetes.io/projected/489bab1c-472b-4904-b4bb-3dc341f48f27-kube-api-access-xksqm\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.420186 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wwv9\" (UniqueName: \"kubernetes.io/projected/90968a16-faa8-4ec7-a565-4d3cb75ed1dd-kube-api-access-9wwv9\") pod \"dns-operator-744455d44c-pnrgl\" (UID: \"90968a16-faa8-4ec7-a565-4d3cb75ed1dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.434343 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.441010 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sdzx\" (UniqueName: \"kubernetes.io/projected/da00f555-fb9c-4329-b263-5d1e3c31180e-kube-api-access-5sdzx\") pod \"etcd-operator-b45778765-87dv2\" (UID: \"da00f555-fb9c-4329-b263-5d1e3c31180e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.459263 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6kc6m\" (UID: \"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.481172 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wck4\" (UniqueName: \"kubernetes.io/projected/4af6ea3c-7da8-465d-b056-9b9f0a67f626-kube-api-access-8wck4\") pod \"machine-config-operator-74547568cd-tqjpt\" (UID: \"4af6ea3c-7da8-465d-b056-9b9f0a67f626\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:32 crc kubenswrapper[4777]: W0216 21:40:32.481281 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80ba78f_7361_4793_911d_c2db9761916e.slice/crio-29c94714701952d2b38e1f3c8b8e9e09e2062e771067be25b62dcfa5d06274b5 WatchSource:0}: Error finding container 29c94714701952d2b38e1f3c8b8e9e09e2062e771067be25b62dcfa5d06274b5: Status 404 returned error can't find the container with id 29c94714701952d2b38e1f3c8b8e9e09e2062e771067be25b62dcfa5d06274b5 Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.485028 4777 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-phc5s" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.508929 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.509665 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/489bab1c-472b-4904-b4bb-3dc341f48f27-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wr74s\" (UID: \"489bab1c-472b-4904-b4bb-3dc341f48f27\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.524093 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.535137 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.604243 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605570 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-certificates\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605616 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8953b31-9a7b-4291-8ef8-02e3eb34f0d8-profile-collector-cert\") pod \"catalog-operator-68c6474976-4xwfd\" (UID: \"e8953b31-9a7b-4291-8ef8-02e3eb34f0d8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605641 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-tls\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605685 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acf589f2-0447-4431-92cc-73956a345b44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfdd6\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605707 4777 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-trusted-ca\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605735 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b16579-167c-42bb-81f1-e688bff397b3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-j5x2r\" (UID: \"d8b16579-167c-42bb-81f1-e688bff397b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605750 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8953b31-9a7b-4291-8ef8-02e3eb34f0d8-srv-cert\") pod \"catalog-operator-68c6474976-4xwfd\" (UID: \"e8953b31-9a7b-4291-8ef8-02e3eb34f0d8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605779 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605794 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lghq4\" (UniqueName: 
\"kubernetes.io/projected/e8953b31-9a7b-4291-8ef8-02e3eb34f0d8-kube-api-access-lghq4\") pod \"catalog-operator-68c6474976-4xwfd\" (UID: \"e8953b31-9a7b-4291-8ef8-02e3eb34f0d8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605830 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605881 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be3225d4-c821-48fc-8f4b-bfa603232b90-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4zqxx\" (UID: \"be3225d4-c821-48fc-8f4b-bfa603232b90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605905 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rrnd\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-kube-api-access-6rrnd\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605923 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt79l\" (UniqueName: \"kubernetes.io/projected/d8b16579-167c-42bb-81f1-e688bff397b3-kube-api-access-zt79l\") pod \"kube-storage-version-migrator-operator-b67b599dd-j5x2r\" (UID: 
\"d8b16579-167c-42bb-81f1-e688bff397b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605957 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/299577d6-9dd4-409c-bb21-c3fed59fd3bb-signing-key\") pod \"service-ca-9c57cc56f-bcpwn\" (UID: \"299577d6-9dd4-409c-bb21-c3fed59fd3bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.605983 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpffr\" (UniqueName: \"kubernetes.io/projected/299577d6-9dd4-409c-bb21-c3fed59fd3bb-kube-api-access-kpffr\") pod \"service-ca-9c57cc56f-bcpwn\" (UID: \"299577d6-9dd4-409c-bb21-c3fed59fd3bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.606009 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3225d4-c821-48fc-8f4b-bfa603232b90-config\") pod \"kube-apiserver-operator-766d6c64bb-4zqxx\" (UID: \"be3225d4-c821-48fc-8f4b-bfa603232b90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.606032 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8b16579-167c-42bb-81f1-e688bff397b3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-j5x2r\" (UID: \"d8b16579-167c-42bb-81f1-e688bff397b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.606080 4777 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acf589f2-0447-4431-92cc-73956a345b44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfdd6\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.606096 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3225d4-c821-48fc-8f4b-bfa603232b90-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4zqxx\" (UID: \"be3225d4-c821-48fc-8f4b-bfa603232b90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.606155 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.606172 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw7zz\" (UniqueName: \"kubernetes.io/projected/acf589f2-0447-4431-92cc-73956a345b44-kube-api-access-pw7zz\") pod \"marketplace-operator-79b997595-dfdd6\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.606198 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-bound-sa-token\") pod 
\"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.606213 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/299577d6-9dd4-409c-bb21-c3fed59fd3bb-signing-cabundle\") pod \"service-ca-9c57cc56f-bcpwn\" (UID: \"299577d6-9dd4-409c-bb21-c3fed59fd3bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.606276 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rnc8\" (UniqueName: \"kubernetes.io/projected/93605ec1-16ef-4ffc-aa50-8096ea8e8acc-kube-api-access-4rnc8\") pod \"migrator-59844c95c7-gg47z\" (UID: \"93605ec1-16ef-4ffc-aa50-8096ea8e8acc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z" Feb 16 21:40:32 crc kubenswrapper[4777]: E0216 21:40:32.608446 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:33.108430754 +0000 UTC m=+153.690931856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.613850 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.622488 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.650509 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.675480 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.706946 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707177 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acf589f2-0447-4431-92cc-73956a345b44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfdd6\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707256 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b16579-167c-42bb-81f1-e688bff397b3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-j5x2r\" (UID: \"d8b16579-167c-42bb-81f1-e688bff397b3\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707282 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-trusted-ca\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707307 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgfgk\" (UniqueName: \"kubernetes.io/projected/9fee359b-721a-4ae4-96fd-1edd2b309712-kube-api-access-qgfgk\") pod \"service-ca-operator-777779d784-bwxqt\" (UID: \"9fee359b-721a-4ae4-96fd-1edd2b309712\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707332 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8953b31-9a7b-4291-8ef8-02e3eb34f0d8-srv-cert\") pod \"catalog-operator-68c6474976-4xwfd\" (UID: \"e8953b31-9a7b-4291-8ef8-02e3eb34f0d8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707371 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707387 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lghq4\" (UniqueName: \"kubernetes.io/projected/e8953b31-9a7b-4291-8ef8-02e3eb34f0d8-kube-api-access-lghq4\") pod \"catalog-operator-68c6474976-4xwfd\" (UID: \"e8953b31-9a7b-4291-8ef8-02e3eb34f0d8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707435 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1233885f-29aa-46d2-8063-b88df7132010-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xrkzz\" (UID: \"1233885f-29aa-46d2-8063-b88df7132010\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707455 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a-cert\") pod \"ingress-canary-kr6qs\" (UID: \"c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a\") " pod="openshift-ingress-canary/ingress-canary-kr6qs"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707473 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-csi-data-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707499 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-plugins-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707535 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d6810abc-8d3e-49d2-8a37-c52a22aee915-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kpqz8\" (UID: \"d6810abc-8d3e-49d2-8a37-c52a22aee915\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707551 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv6r7\" (UniqueName: \"kubernetes.io/projected/6845600c-842e-4274-86af-d8dc3ae0beb7-kube-api-access-tv6r7\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707607 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be3225d4-c821-48fc-8f4b-bfa603232b90-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4zqxx\" (UID: \"be3225d4-c821-48fc-8f4b-bfa603232b90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707636 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87kjn\" (UniqueName: \"kubernetes.io/projected/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-kube-api-access-87kjn\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707652 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rrnd\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-kube-api-access-6rrnd\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707667 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eadd010-ae18-4453-8810-1f8edf434cb3-config-volume\") pod \"collect-profiles-29521290-h7lml\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707702 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt79l\" (UniqueName: \"kubernetes.io/projected/d8b16579-167c-42bb-81f1-e688bff397b3-kube-api-access-zt79l\") pod \"kube-storage-version-migrator-operator-b67b599dd-j5x2r\" (UID: \"d8b16579-167c-42bb-81f1-e688bff397b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707750 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkmw4\" (UniqueName: \"kubernetes.io/projected/d6810abc-8d3e-49d2-8a37-c52a22aee915-kube-api-access-mkmw4\") pod \"multus-admission-controller-857f4d67dd-kpqz8\" (UID: \"d6810abc-8d3e-49d2-8a37-c52a22aee915\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707768 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6845600c-842e-4274-86af-d8dc3ae0beb7-webhook-cert\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707834 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/299577d6-9dd4-409c-bb21-c3fed59fd3bb-signing-key\") pod \"service-ca-9c57cc56f-bcpwn\" (UID: \"299577d6-9dd4-409c-bb21-c3fed59fd3bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707851 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eadd010-ae18-4453-8810-1f8edf434cb3-secret-volume\") pod \"collect-profiles-29521290-h7lml\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707896 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpffr\" (UniqueName: \"kubernetes.io/projected/299577d6-9dd4-409c-bb21-c3fed59fd3bb-kube-api-access-kpffr\") pod \"service-ca-9c57cc56f-bcpwn\" (UID: \"299577d6-9dd4-409c-bb21-c3fed59fd3bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707914 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3751ea5e-c531-40db-a0ac-b4110a0f4aed-node-bootstrap-token\") pod \"machine-config-server-9skvx\" (UID: \"3751ea5e-c531-40db-a0ac-b4110a0f4aed\") " pod="openshift-machine-config-operator/machine-config-server-9skvx"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707943 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3225d4-c821-48fc-8f4b-bfa603232b90-config\") pod \"kube-apiserver-operator-766d6c64bb-4zqxx\" (UID: \"be3225d4-c821-48fc-8f4b-bfa603232b90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707978 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8b16579-167c-42bb-81f1-e688bff397b3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-j5x2r\" (UID: \"d8b16579-167c-42bb-81f1-e688bff397b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.707993 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-socket-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708056 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acf589f2-0447-4431-92cc-73956a345b44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfdd6\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708074 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl2fv\" (UniqueName: \"kubernetes.io/projected/1233885f-29aa-46d2-8063-b88df7132010-kube-api-access-jl2fv\") pod \"olm-operator-6b444d44fb-xrkzz\" (UID: \"1233885f-29aa-46d2-8063-b88df7132010\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708091 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3225d4-c821-48fc-8f4b-bfa603232b90-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4zqxx\" (UID: \"be3225d4-c821-48fc-8f4b-bfa603232b90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708156 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm6gf\" (UniqueName: \"kubernetes.io/projected/3751ea5e-c531-40db-a0ac-b4110a0f4aed-kube-api-access-bm6gf\") pod \"machine-config-server-9skvx\" (UID: \"3751ea5e-c531-40db-a0ac-b4110a0f4aed\") " pod="openshift-machine-config-operator/machine-config-server-9skvx"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708218 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708235 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw7zz\" (UniqueName: \"kubernetes.io/projected/acf589f2-0447-4431-92cc-73956a345b44-kube-api-access-pw7zz\") pod \"marketplace-operator-79b997595-dfdd6\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708298 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-bound-sa-token\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708316 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/299577d6-9dd4-409c-bb21-c3fed59fd3bb-signing-cabundle\") pod \"service-ca-9c57cc56f-bcpwn\" (UID: \"299577d6-9dd4-409c-bb21-c3fed59fd3bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708423 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rnc8\" (UniqueName: \"kubernetes.io/projected/93605ec1-16ef-4ffc-aa50-8096ea8e8acc-kube-api-access-4rnc8\") pod \"migrator-59844c95c7-gg47z\" (UID: \"93605ec1-16ef-4ffc-aa50-8096ea8e8acc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708459 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkvkj\" (UniqueName: \"kubernetes.io/projected/4eadd010-ae18-4453-8810-1f8edf434cb3-kube-api-access-qkvkj\") pod \"collect-profiles-29521290-h7lml\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708476 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-mountpoint-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708501 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxm4l\" (UniqueName: \"kubernetes.io/projected/0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346-kube-api-access-lxm4l\") pod \"package-server-manager-789f6589d5-j9d2t\" (UID: \"0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708538 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cnph\" (UniqueName: \"kubernetes.io/projected/c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a-kube-api-access-2cnph\") pod \"ingress-canary-kr6qs\" (UID: \"c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a\") " pod="openshift-ingress-canary/ingress-canary-kr6qs"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708596 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fee359b-721a-4ae4-96fd-1edd2b309712-config\") pod \"service-ca-operator-777779d784-bwxqt\" (UID: \"9fee359b-721a-4ae4-96fd-1edd2b309712\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708613 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6845600c-842e-4274-86af-d8dc3ae0beb7-tmpfs\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708684 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6845600c-842e-4274-86af-d8dc3ae0beb7-apiservice-cert\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708702 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fee359b-721a-4ae4-96fd-1edd2b309712-serving-cert\") pod \"service-ca-operator-777779d784-bwxqt\" (UID: \"9fee359b-721a-4ae4-96fd-1edd2b309712\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708735 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-certificates\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708751 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1233885f-29aa-46d2-8063-b88df7132010-srv-cert\") pod \"olm-operator-6b444d44fb-xrkzz\" (UID: \"1233885f-29aa-46d2-8063-b88df7132010\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708788 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8953b31-9a7b-4291-8ef8-02e3eb34f0d8-profile-collector-cert\") pod \"catalog-operator-68c6474976-4xwfd\" (UID: \"e8953b31-9a7b-4291-8ef8-02e3eb34f0d8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708805 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j9d2t\" (UID: \"0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708821 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3751ea5e-c531-40db-a0ac-b4110a0f4aed-certs\") pod \"machine-config-server-9skvx\" (UID: \"3751ea5e-c531-40db-a0ac-b4110a0f4aed\") " pod="openshift-machine-config-operator/machine-config-server-9skvx"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708840 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-tls\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.708885 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-registration-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: E0216 21:40:32.709931 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:33.209896859 +0000 UTC m=+153.792397961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.710564 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.712042 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/299577d6-9dd4-409c-bb21-c3fed59fd3bb-signing-cabundle\") pod \"service-ca-9c57cc56f-bcpwn\" (UID: \"299577d6-9dd4-409c-bb21-c3fed59fd3bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.714179 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.714489 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8953b31-9a7b-4291-8ef8-02e3eb34f0d8-srv-cert\") pod \"catalog-operator-68c6474976-4xwfd\" (UID: \"e8953b31-9a7b-4291-8ef8-02e3eb34f0d8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.716311 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-certificates\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.716971 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acf589f2-0447-4431-92cc-73956a345b44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfdd6\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.717921 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b16579-167c-42bb-81f1-e688bff397b3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-j5x2r\" (UID: \"d8b16579-167c-42bb-81f1-e688bff397b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.718857 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3225d4-c821-48fc-8f4b-bfa603232b90-config\") pod \"kube-apiserver-operator-766d6c64bb-4zqxx\" (UID: \"be3225d4-c821-48fc-8f4b-bfa603232b90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.721621 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-trusted-ca\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.721755 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.722446 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3225d4-c821-48fc-8f4b-bfa603232b90-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4zqxx\" (UID: \"be3225d4-c821-48fc-8f4b-bfa603232b90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.722484 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8953b31-9a7b-4291-8ef8-02e3eb34f0d8-profile-collector-cert\") pod \"catalog-operator-68c6474976-4xwfd\" (UID: \"e8953b31-9a7b-4291-8ef8-02e3eb34f0d8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.722977 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8b16579-167c-42bb-81f1-e688bff397b3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-j5x2r\" (UID: \"d8b16579-167c-42bb-81f1-e688bff397b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.736734 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acf589f2-0447-4431-92cc-73956a345b44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfdd6\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.740332 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/299577d6-9dd4-409c-bb21-c3fed59fd3bb-signing-key\") pod \"service-ca-9c57cc56f-bcpwn\" (UID: \"299577d6-9dd4-409c-bb21-c3fed59fd3bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.743793 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-tls\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.767815 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw7zz\" (UniqueName: \"kubernetes.io/projected/acf589f2-0447-4431-92cc-73956a345b44-kube-api-access-pw7zz\") pod \"marketplace-operator-79b997595-dfdd6\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.782924 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-bound-sa-token\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.800095 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lghq4\" (UniqueName: \"kubernetes.io/projected/e8953b31-9a7b-4291-8ef8-02e3eb34f0d8-kube-api-access-lghq4\") pod \"catalog-operator-68c6474976-4xwfd\" (UID: \"e8953b31-9a7b-4291-8ef8-02e3eb34f0d8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.801441 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg"]
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810592 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1233885f-29aa-46d2-8063-b88df7132010-srv-cert\") pod \"olm-operator-6b444d44fb-xrkzz\" (UID: \"1233885f-29aa-46d2-8063-b88df7132010\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810632 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j9d2t\" (UID: \"0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810658 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3751ea5e-c531-40db-a0ac-b4110a0f4aed-certs\") pod \"machine-config-server-9skvx\" (UID: \"3751ea5e-c531-40db-a0ac-b4110a0f4aed\") " pod="openshift-machine-config-operator/machine-config-server-9skvx"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810701 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-registration-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810744 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgfgk\" (UniqueName: \"kubernetes.io/projected/9fee359b-721a-4ae4-96fd-1edd2b309712-kube-api-access-qgfgk\") pod \"service-ca-operator-777779d784-bwxqt\" (UID: \"9fee359b-721a-4ae4-96fd-1edd2b309712\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810772 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810786 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1233885f-29aa-46d2-8063-b88df7132010-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xrkzz\" (UID: \"1233885f-29aa-46d2-8063-b88df7132010\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810801 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a-cert\") pod \"ingress-canary-kr6qs\" (UID: \"c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a\") " pod="openshift-ingress-canary/ingress-canary-kr6qs"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810817 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-csi-data-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810830 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-plugins-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810847 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d6810abc-8d3e-49d2-8a37-c52a22aee915-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kpqz8\" (UID: \"d6810abc-8d3e-49d2-8a37-c52a22aee915\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810863 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv6r7\" (UniqueName: \"kubernetes.io/projected/6845600c-842e-4274-86af-d8dc3ae0beb7-kube-api-access-tv6r7\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810888 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87kjn\" (UniqueName: \"kubernetes.io/projected/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-kube-api-access-87kjn\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810920 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eadd010-ae18-4453-8810-1f8edf434cb3-config-volume\") pod \"collect-profiles-29521290-h7lml\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810965 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkmw4\" (UniqueName: \"kubernetes.io/projected/d6810abc-8d3e-49d2-8a37-c52a22aee915-kube-api-access-mkmw4\") pod \"multus-admission-controller-857f4d67dd-kpqz8\" (UID: \"d6810abc-8d3e-49d2-8a37-c52a22aee915\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.810987 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6845600c-842e-4274-86af-d8dc3ae0beb7-webhook-cert\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811011 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eadd010-ae18-4453-8810-1f8edf434cb3-secret-volume\") pod \"collect-profiles-29521290-h7lml\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811042 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3751ea5e-c531-40db-a0ac-b4110a0f4aed-node-bootstrap-token\") pod \"machine-config-server-9skvx\" (UID: \"3751ea5e-c531-40db-a0ac-b4110a0f4aed\") " pod="openshift-machine-config-operator/machine-config-server-9skvx"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811072 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-socket-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811101 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl2fv\" (UniqueName: \"kubernetes.io/projected/1233885f-29aa-46d2-8063-b88df7132010-kube-api-access-jl2fv\") pod \"olm-operator-6b444d44fb-xrkzz\" (UID: \"1233885f-29aa-46d2-8063-b88df7132010\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811119 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm6gf\" (UniqueName: \"kubernetes.io/projected/3751ea5e-c531-40db-a0ac-b4110a0f4aed-kube-api-access-bm6gf\") pod \"machine-config-server-9skvx\" (UID: \"3751ea5e-c531-40db-a0ac-b4110a0f4aed\") " pod="openshift-machine-config-operator/machine-config-server-9skvx"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811180 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkvkj\" (UniqueName: \"kubernetes.io/projected/4eadd010-ae18-4453-8810-1f8edf434cb3-kube-api-access-qkvkj\") pod \"collect-profiles-29521290-h7lml\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811214 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-mountpoint-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811241 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxm4l\" (UniqueName: \"kubernetes.io/projected/0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346-kube-api-access-lxm4l\") pod \"package-server-manager-789f6589d5-j9d2t\" (UID: \"0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t"
Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811267 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cnph\" (UniqueName: \"kubernetes.io/projected/c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a-kube-api-access-2cnph\") pod \"ingress-canary-kr6qs\" (UID: \"c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a\") " pod="openshift-ingress-canary/ingress-canary-kr6qs"
Feb 16 21:40:32 crc kubenswrapper[4777]: E0216 21:40:32.811288 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:33.31126659 +0000 UTC m=+153.893767692 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811340 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fee359b-721a-4ae4-96fd-1edd2b309712-config\") pod \"service-ca-operator-777779d784-bwxqt\" (UID: \"9fee359b-721a-4ae4-96fd-1edd2b309712\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811373 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6845600c-842e-4274-86af-d8dc3ae0beb7-tmpfs\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811407 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6845600c-842e-4274-86af-d8dc3ae0beb7-apiservice-cert\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.811429 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fee359b-721a-4ae4-96fd-1edd2b309712-serving-cert\") pod \"service-ca-operator-777779d784-bwxqt\" (UID: 
\"9fee359b-721a-4ae4-96fd-1edd2b309712\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.812804 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eadd010-ae18-4453-8810-1f8edf434cb3-config-volume\") pod \"collect-profiles-29521290-h7lml\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.813326 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-plugins-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.814334 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-csi-data-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.817993 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1233885f-29aa-46d2-8063-b88df7132010-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xrkzz\" (UID: \"1233885f-29aa-46d2-8063-b88df7132010\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.820592 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-socket-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: 
\"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.820801 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6845600c-842e-4274-86af-d8dc3ae0beb7-tmpfs\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.820900 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-registration-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.822163 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-mountpoint-dir\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.823949 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fee359b-721a-4ae4-96fd-1edd2b309712-config\") pod \"service-ca-operator-777779d784-bwxqt\" (UID: \"9fee359b-721a-4ae4-96fd-1edd2b309712\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.827456 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2"] Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.837561 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/4eadd010-ae18-4453-8810-1f8edf434cb3-secret-volume\") pod \"collect-profiles-29521290-h7lml\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.845920 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6845600c-842e-4274-86af-d8dc3ae0beb7-webhook-cert\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.847087 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fee359b-721a-4ae4-96fd-1edd2b309712-serving-cert\") pod \"service-ca-operator-777779d784-bwxqt\" (UID: \"9fee359b-721a-4ae4-96fd-1edd2b309712\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.848235 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d6810abc-8d3e-49d2-8a37-c52a22aee915-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kpqz8\" (UID: \"d6810abc-8d3e-49d2-8a37-c52a22aee915\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.849547 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3751ea5e-c531-40db-a0ac-b4110a0f4aed-node-bootstrap-token\") pod \"machine-config-server-9skvx\" (UID: \"3751ea5e-c531-40db-a0ac-b4110a0f4aed\") " pod="openshift-machine-config-operator/machine-config-server-9skvx" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.850139 4777 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g"] Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.851586 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1233885f-29aa-46d2-8063-b88df7132010-srv-cert\") pod \"olm-operator-6b444d44fb-xrkzz\" (UID: \"1233885f-29aa-46d2-8063-b88df7132010\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.851927 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a-cert\") pod \"ingress-canary-kr6qs\" (UID: \"c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a\") " pod="openshift-ingress-canary/ingress-canary-kr6qs" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.852729 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sx74s"] Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.855409 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fb8q7"] Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.856886 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j9d2t\" (UID: \"0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.867729 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rnc8\" (UniqueName: \"kubernetes.io/projected/93605ec1-16ef-4ffc-aa50-8096ea8e8acc-kube-api-access-4rnc8\") pod 
\"migrator-59844c95c7-gg47z\" (UID: \"93605ec1-16ef-4ffc-aa50-8096ea8e8acc\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.869385 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h"] Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.869793 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3751ea5e-c531-40db-a0ac-b4110a0f4aed-certs\") pod \"machine-config-server-9skvx\" (UID: \"3751ea5e-c531-40db-a0ac-b4110a0f4aed\") " pod="openshift-machine-config-operator/machine-config-server-9skvx" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.870338 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6845600c-842e-4274-86af-d8dc3ae0beb7-apiservice-cert\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.878337 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be3225d4-c821-48fc-8f4b-bfa603232b90-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4zqxx\" (UID: \"be3225d4-c821-48fc-8f4b-bfa603232b90\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.878362 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpffr\" (UniqueName: \"kubernetes.io/projected/299577d6-9dd4-409c-bb21-c3fed59fd3bb-kube-api-access-kpffr\") pod \"service-ca-9c57cc56f-bcpwn\" (UID: \"299577d6-9dd4-409c-bb21-c3fed59fd3bb\") " pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn" Feb 16 21:40:32 crc 
kubenswrapper[4777]: I0216 21:40:32.885533 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt79l\" (UniqueName: \"kubernetes.io/projected/d8b16579-167c-42bb-81f1-e688bff397b3-kube-api-access-zt79l\") pod \"kube-storage-version-migrator-operator-b67b599dd-j5x2r\" (UID: \"d8b16579-167c-42bb-81f1-e688bff397b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.907727 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rrnd\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-kube-api-access-6rrnd\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.912143 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:32 crc kubenswrapper[4777]: E0216 21:40:32.912543 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:33.412519837 +0000 UTC m=+153.995020939 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.925443 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.933430 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.933826 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg"] Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.937629 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r5cl6"] Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.941820 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv6r7\" (UniqueName: \"kubernetes.io/projected/6845600c-842e-4274-86af-d8dc3ae0beb7-kube-api-access-tv6r7\") pod \"packageserver-d55dfcdfc-x8jn4\" (UID: \"6845600c-842e-4274-86af-d8dc3ae0beb7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.951632 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5pdvt"] Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.960834 4777 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkmw4\" (UniqueName: \"kubernetes.io/projected/d6810abc-8d3e-49d2-8a37-c52a22aee915-kube-api-access-mkmw4\") pod \"multus-admission-controller-857f4d67dd-kpqz8\" (UID: \"d6810abc-8d3e-49d2-8a37-c52a22aee915\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.964621 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z" Feb 16 21:40:32 crc kubenswrapper[4777]: I0216 21:40:32.970825 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:32.999891 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.000014 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.000113 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.007660 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87kjn\" (UniqueName: \"kubernetes.io/projected/b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6-kube-api-access-87kjn\") pod \"csi-hostpathplugin-j4h22\" (UID: \"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6\") " pod="hostpath-provisioner/csi-hostpathplugin-j4h22" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.007669 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cnph\" (UniqueName: \"kubernetes.io/projected/c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a-kube-api-access-2cnph\") pod \"ingress-canary-kr6qs\" (UID: \"c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a\") " pod="openshift-ingress-canary/ingress-canary-kr6qs" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.013293 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:33 crc kubenswrapper[4777]: E0216 21:40:33.013694 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:33.513673841 +0000 UTC m=+154.096174943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.030510 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl2fv\" (UniqueName: \"kubernetes.io/projected/1233885f-29aa-46d2-8063-b88df7132010-kube-api-access-jl2fv\") pod \"olm-operator-6b444d44fb-xrkzz\" (UID: \"1233885f-29aa-46d2-8063-b88df7132010\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.038699 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkvkj\" (UniqueName: \"kubernetes.io/projected/4eadd010-ae18-4453-8810-1f8edf434cb3-kube-api-access-qkvkj\") pod \"collect-profiles-29521290-h7lml\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.041942 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.055115 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.066447 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.067509 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm6gf\" (UniqueName: \"kubernetes.io/projected/3751ea5e-c531-40db-a0ac-b4110a0f4aed-kube-api-access-bm6gf\") pod \"machine-config-server-9skvx\" (UID: \"3751ea5e-c531-40db-a0ac-b4110a0f4aed\") " pod="openshift-machine-config-operator/machine-config-server-9skvx" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.073112 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.080583 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rxnqn"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.084334 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgfgk\" (UniqueName: \"kubernetes.io/projected/9fee359b-721a-4ae4-96fd-1edd2b309712-kube-api-access-qgfgk\") pod \"service-ca-operator-777779d784-bwxqt\" (UID: \"9fee359b-721a-4ae4-96fd-1edd2b309712\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.088388 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9skvx" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.095168 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kr6qs" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.109688 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxm4l\" (UniqueName: \"kubernetes.io/projected/0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346-kube-api-access-lxm4l\") pod \"package-server-manager-789f6589d5-j9d2t\" (UID: \"0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.116035 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-phc5s"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.116231 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:33 crc kubenswrapper[4777]: E0216 21:40:33.116634 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:33.616614585 +0000 UTC m=+154.199115687 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.117013 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-j4h22" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.137642 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lzg58"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.212125 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" event={"ID":"027a50f3-fb4e-491e-b410-9d2ed7b4b836","Type":"ContainerStarted","Data":"6bf4a8720a38bfa79fb930255b7786e5a97901c5d5bb50c0e7ca6a04b18ec69a"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.217853 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:33 crc kubenswrapper[4777]: E0216 21:40:33.218377 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:33.718351248 +0000 UTC m=+154.300852350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.219826 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" event={"ID":"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928","Type":"ContainerStarted","Data":"ee0500faa93a56d508adabd4ef9a76c8f711cc392fc685d93bedbf953856bcca"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.230367 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" event={"ID":"fc7762e5-2274-4c99-8d8c-0fb23340150f","Type":"ContainerStarted","Data":"0a7db6fe8ab82d3a0427e64d4bb252e957376e67e93c0e845b99f5c8955ede06"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.232303 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" event={"ID":"5c3d4c45-0af9-4160-9c7f-db96f50e2210","Type":"ContainerStarted","Data":"86fb8d3a4214e82604e03fc6410060778d1b017e8b4bf49c0b15ffb48ae9e730"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.232328 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" event={"ID":"5c3d4c45-0af9-4160-9c7f-db96f50e2210","Type":"ContainerStarted","Data":"a47f8e0bd7dcf061f991810fd0c507b1f030613a214d59546ece9b96d85a81d2"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.234212 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" event={"ID":"63641d03-7d3e-4b50-a5da-722b2b5b97c2","Type":"ContainerStarted","Data":"da4a622d10d6d690486a934f799a9bfcc310f7bfd9ec9cd4ed75f38b66ac87a3"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.248305 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" event={"ID":"ca8070e1-ca1b-4716-9e12-c72f37e21f95","Type":"ContainerStarted","Data":"19f0d9b6e80eb6c01d1e140f35a5f9153c1bc721e45b9659d51cb6076511f5ee"} Feb 16 21:40:33 crc kubenswrapper[4777]: W0216 21:40:33.251420 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod263c0239_456c_4076_ae02_eb2a4abc33e4.slice/crio-befa9102e1c086e6fd36bd2d609a35b54506d6904744742a27f2df2c7841f286 WatchSource:0}: Error finding container befa9102e1c086e6fd36bd2d609a35b54506d6904744742a27f2df2c7841f286: Status 404 returned error can't find the container with id befa9102e1c086e6fd36bd2d609a35b54506d6904744742a27f2df2c7841f286 Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.252672 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" event={"ID":"98ef6550-85de-4b77-b522-865f342fdd21","Type":"ContainerStarted","Data":"a6ffe806fc99d02d6919891b9a0fc522fe8d29a99222023dcdf5f0cc5505bfc5"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.286134 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-l29zb" event={"ID":"1457b00a-86eb-457e-99a3-bf4bb271c513","Type":"ContainerStarted","Data":"0e3ed7132a1f236fc48f230742a6a54047140b890ffd77bfa30bd841454bd7e0"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.286231 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-l29zb" 
event={"ID":"1457b00a-86eb-457e-99a3-bf4bb271c513","Type":"ContainerStarted","Data":"ee98d2f1463873f423b3be83b16f8c9247905029cb67e31ad7fd5147da5d1fac"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.303150 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" event={"ID":"8b1dfd22-6fa0-4096-97e4-871e6b65ad30","Type":"ContainerStarted","Data":"b3dfcf502d9aa807a54f482bd82abb7d95bac83a9350828ec5612819d9b62b78"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.304952 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8jfdb" event={"ID":"f80ba78f-7361-4793-911d-c2db9761916e","Type":"ContainerStarted","Data":"6f587508065095308b7dc35fbc82ab11c2f51dabecc9b94bbbe497b0931fd674"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.304978 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8jfdb" event={"ID":"f80ba78f-7361-4793-911d-c2db9761916e","Type":"ContainerStarted","Data":"29c94714701952d2b38e1f3c8b8e9e09e2062e771067be25b62dcfa5d06274b5"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.305244 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.305627 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" event={"ID":"5615afd6-9159-4ccc-b08d-4305b9b792bb","Type":"ContainerStarted","Data":"becce2790bde4b4a84aaf0ad726739a96d057aa6094999512b99db44c16224ef"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.309048 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" 
event={"ID":"fc4b8a4b-737b-411b-a707-fd1e086b685e","Type":"ContainerStarted","Data":"37354095c0e4a484c8caac3bbb56ba9ea2913fb8530adcabc5b7f6ed830bacb2"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.318737 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:33 crc kubenswrapper[4777]: E0216 21:40:33.319078 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:33.819048447 +0000 UTC m=+154.401549549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.325596 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" event={"ID":"3153ee9e-fa94-4bb7-8506-16778984382b","Type":"ContainerStarted","Data":"2c67c1948522f06ed3c4028396f19d5a4e363aee17a9fe922884965231d1b254"} Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.327594 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.335131 4777 
patch_prober.go:28] interesting pod/console-operator-58897d9998-8jfdb container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.335183 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8jfdb" podUID="f80ba78f-7361-4793-911d-c2db9761916e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.355607 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.355669 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.356978 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.359843 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.382123 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t" Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.383359 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.389869 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.399804 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kgdkp"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.400988 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf"] Feb 16 21:40:33 crc kubenswrapper[4777]: W0216 21:40:33.410765 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3751ea5e_c531_40db_a0ac_b4110a0f4aed.slice/crio-574da8ba6db3c5ecff1a50305d8153b53483c39f843f0cef9fa4ca4bd5178d92 WatchSource:0}: Error finding container 574da8ba6db3c5ecff1a50305d8153b53483c39f843f0cef9fa4ca4bd5178d92: Status 404 returned error can't find the container with id 574da8ba6db3c5ecff1a50305d8153b53483c39f843f0cef9fa4ca4bd5178d92 Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.419970 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:33 crc kubenswrapper[4777]: E0216 21:40:33.420283 4777 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:33.920269203 +0000 UTC m=+154.502770305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.463911 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.464165 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-87dv2"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.477318 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.485799 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.503565 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pnrgl"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.519140 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.520639 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:33 crc kubenswrapper[4777]: E0216 21:40:33.520818 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.020782805 +0000 UTC m=+154.603283907 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.522657 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:33 crc kubenswrapper[4777]: E0216 21:40:33.523057 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.023036551 +0000 UTC m=+154.605537643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.545109 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kpqz8"] Feb 16 21:40:33 crc kubenswrapper[4777]: W0216 21:40:33.552487 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe3225d4_c821_48fc_8f4b_bfa603232b90.slice/crio-e5ae99a0ce0ea6cbc480ad9ea6950914e82dbfd9daa0038e5a335282d4a2edc6 WatchSource:0}: Error finding container e5ae99a0ce0ea6cbc480ad9ea6950914e82dbfd9daa0038e5a335282d4a2edc6: Status 404 returned error can't find the container with id e5ae99a0ce0ea6cbc480ad9ea6950914e82dbfd9daa0038e5a335282d4a2edc6 Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.624391 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:33 crc kubenswrapper[4777]: E0216 21:40:33.625145 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.125126897 +0000 UTC m=+154.707627989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.645110 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.676494 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-bcpwn"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.725839 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:33 crc kubenswrapper[4777]: E0216 21:40:33.726255 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.226238379 +0000 UTC m=+154.808739481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.831569 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:33 crc kubenswrapper[4777]: E0216 21:40:33.832265 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.332243657 +0000 UTC m=+154.914744759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.877338 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.925586 4777 csr.go:261] certificate signing request csr-qkjjr is approved, waiting to be issued Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.933440 4777 csr.go:257] certificate signing request csr-qkjjr is issued Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.935503 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:33 crc kubenswrapper[4777]: E0216 21:40:33.935849 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.435836243 +0000 UTC m=+155.018337345 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.954868 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-j4h22"] Feb 16 21:40:33 crc kubenswrapper[4777]: I0216 21:40:33.976648 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfdd6"] Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.002976 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz"] Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.036766 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.037274 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.537227125 +0000 UTC m=+155.119728227 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.060902 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kr6qs"] Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.107850 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t"] Feb 16 21:40:34 crc kubenswrapper[4777]: W0216 21:40:34.122700 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1233885f_29aa_46d2_8063_b88df7132010.slice/crio-200926e65a3b677bbcda6b775e2fe725f7cfc8325032908cc1eab21301c8e616 WatchSource:0}: Error finding container 200926e65a3b677bbcda6b775e2fe725f7cfc8325032908cc1eab21301c8e616: Status 404 returned error can't find the container with id 200926e65a3b677bbcda6b775e2fe725f7cfc8325032908cc1eab21301c8e616 Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.140134 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.140505 4777 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.64049076 +0000 UTC m=+155.222991862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: W0216 21:40:34.142015 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacf589f2_0447_4431_92cc_73956a345b44.slice/crio-5dba7c803ea67153cc734e188a62ee961d392e5c18f053cb2001a65a8ff98c2b WatchSource:0}: Error finding container 5dba7c803ea67153cc734e188a62ee961d392e5c18f053cb2001a65a8ff98c2b: Status 404 returned error can't find the container with id 5dba7c803ea67153cc734e188a62ee961d392e5c18f053cb2001a65a8ff98c2b Feb 16 21:40:34 crc kubenswrapper[4777]: W0216 21:40:34.156168 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6d7882e_80c4_45b9_aa97_c2dbe0bcbe8a.slice/crio-3189151f0637ed8a3af6a3a637cac09f1be1fa64edb516336f787a441dc93a9f WatchSource:0}: Error finding container 3189151f0637ed8a3af6a3a637cac09f1be1fa64edb516336f787a441dc93a9f: Status 404 returned error can't find the container with id 3189151f0637ed8a3af6a3a637cac09f1be1fa64edb516336f787a441dc93a9f Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.178298 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt"] Feb 16 21:40:34 crc 
kubenswrapper[4777]: I0216 21:40:34.238656 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.240876 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.241052 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.741000222 +0000 UTC m=+155.323501324 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.241123 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.241665 4777 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.741598752 +0000 UTC m=+155.324099854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.245089 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:34 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:34 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:34 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.245128 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:34 crc kubenswrapper[4777]: W0216 21:40:34.262348 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fee359b_721a_4ae4_96fd_1edd2b309712.slice/crio-480f4d3cb54b7d0f7487df6d4b00bc501c0079dd8c60ebb4f47eae449f54f097 WatchSource:0}: Error finding container 480f4d3cb54b7d0f7487df6d4b00bc501c0079dd8c60ebb4f47eae449f54f097: Status 404 returned error can't find the container 
with id 480f4d3cb54b7d0f7487df6d4b00bc501c0079dd8c60ebb4f47eae449f54f097 Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.342300 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.342489 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.842452046 +0000 UTC m=+155.424953148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.342650 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.343034 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.843020865 +0000 UTC m=+155.425521967 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.357811 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lzg58" event={"ID":"263c0239-456c-4076-ae02-eb2a4abc33e4","Type":"ContainerStarted","Data":"befa9102e1c086e6fd36bd2d609a35b54506d6904744742a27f2df2c7841f286"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.361017 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxnqn" event={"ID":"7575698b-ab60-49ed-9d95-744b540314a5","Type":"ContainerStarted","Data":"71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.361047 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxnqn" event={"ID":"7575698b-ab60-49ed-9d95-744b540314a5","Type":"ContainerStarted","Data":"fb6f1eae3841a289e5e9643c141a1605b01a28f7b06e6965ef4a3e3dc48aae83"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.368181 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" event={"ID":"8033eae2-307c-42a9-b2b0-cac401f3add8","Type":"ContainerStarted","Data":"c00902aa864e16ac92e17f485831cb689a0440d92d3dce8da85df7528a10d730"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.368216 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" event={"ID":"8033eae2-307c-42a9-b2b0-cac401f3add8","Type":"ContainerStarted","Data":"edbd04f56667fcf66a24eb359222bc9bb5c90e1e9c142af587af31f758fb92a0"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.374128 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r" event={"ID":"d8b16579-167c-42bb-81f1-e688bff397b3","Type":"ContainerStarted","Data":"102b8a061180763b71e16facaa27b099d830c29f0179eec0717bb9bdca8c24a1"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.377863 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" event={"ID":"ca8070e1-ca1b-4716-9e12-c72f37e21f95","Type":"ContainerStarted","Data":"185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.378212 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.379851 4777 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-sx74s container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.379894 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" podUID="ca8070e1-ca1b-4716-9e12-c72f37e21f95" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.382006 4777 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z" event={"ID":"93605ec1-16ef-4ffc-aa50-8096ea8e8acc","Type":"ContainerStarted","Data":"3ca76e2f11fe77e96e2a1c7f39398270f655738eec709b6fb08216e1110196cf"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.387045 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" event={"ID":"6845600c-842e-4274-86af-d8dc3ae0beb7","Type":"ContainerStarted","Data":"ba7f0161edddac0e65599a5a658c6b482d486d2aa5423dbc5d1237226459a1e1"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.389639 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" event={"ID":"027a50f3-fb4e-491e-b410-9d2ed7b4b836","Type":"ContainerStarted","Data":"a5bfa1247a40d22547427091100e0829a7d8b646fef16faeff94c5c957409f74"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.391352 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" event={"ID":"4af6ea3c-7da8-465d-b056-9b9f0a67f626","Type":"ContainerStarted","Data":"f07a6188b6f143427b155a36208318f44c2c01f82529b69f42d565ca1c80525c"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.393231 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" event={"ID":"efb9114f-122a-4b76-8763-4dd9dd908707","Type":"ContainerStarted","Data":"98cf7d80cdc038a8546ccd65769a3f1abcceb1f266983565dd13f65999b5253d"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.395485 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt" event={"ID":"9fee359b-721a-4ae4-96fd-1edd2b309712","Type":"ContainerStarted","Data":"480f4d3cb54b7d0f7487df6d4b00bc501c0079dd8c60ebb4f47eae449f54f097"} 
Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.399287 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" event={"ID":"c1ddbbc5-95a5-4f4f-8a1d-0442229c0928","Type":"ContainerStarted","Data":"812fbfd07b40b2b5a41e1abcd3fd9d7a043758582fc1136c9a44517655c4deee"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.406465 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" event={"ID":"98ef6550-85de-4b77-b522-865f342fdd21","Type":"ContainerStarted","Data":"4da66642d5f80bf9f62e0a6a46b7d3d4a60d5808d02195ad392d429707f10653"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.412494 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9skvx" event={"ID":"3751ea5e-c531-40db-a0ac-b4110a0f4aed","Type":"ContainerStarted","Data":"574da8ba6db3c5ecff1a50305d8153b53483c39f843f0cef9fa4ca4bd5178d92"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.424311 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-phc5s" event={"ID":"29e22b2d-4f29-4258-8b81-31adeae29a3c","Type":"ContainerStarted","Data":"03371fbe6ed282307dcc89446e5fe0cc20af8d181f5e35eea429dd822c4b3e7f"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.424366 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-phc5s" event={"ID":"29e22b2d-4f29-4258-8b81-31adeae29a3c","Type":"ContainerStarted","Data":"3088b843a33aa5d75d01b2a11cf3c4b747be900bd6fe9b2f0ad4a0e0893670b2"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.425329 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-phc5s" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.428691 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" event={"ID":"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a","Type":"ContainerStarted","Data":"32ec29f91c61fa307795e12c942ea2ca49e3419a75d6853f9950418ef404508b"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.433302 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" event={"ID":"fc4b8a4b-737b-411b-a707-fd1e086b685e","Type":"ContainerStarted","Data":"7cd699233b5432f3bc074364ff468cd0fe9e32170ea24dc1aaf20a9d0315422e"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.438726 4777 patch_prober.go:28] interesting pod/downloads-7954f5f757-phc5s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.438785 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-phc5s" podUID="29e22b2d-4f29-4258-8b81-31adeae29a3c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.443443 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.448334 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" 
event={"ID":"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7","Type":"ContainerStarted","Data":"ae9420b609c77d3a759b7016e4b5d3920c54a36ccc79c2e775bb8ad17d6f5190"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.452910 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz" event={"ID":"1233885f-29aa-46d2-8063-b88df7132010","Type":"ContainerStarted","Data":"200926e65a3b677bbcda6b775e2fe725f7cfc8325032908cc1eab21301c8e616"} Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.454590 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:34.94421618 +0000 UTC m=+155.526717282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.456380 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" event={"ID":"4eadd010-ae18-4453-8810-1f8edf434cb3","Type":"ContainerStarted","Data":"e426f29730b689a9b57a0428c64699d4b5deedc47b96ebc7c2d58bc870308487"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.470472 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" 
event={"ID":"90968a16-faa8-4ec7-a565-4d3cb75ed1dd","Type":"ContainerStarted","Data":"4a86a20406b684ce56ae59e4d3cbc926ba7b46093c0aee5651ce7f90528e45ca"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.474777 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" event={"ID":"63641d03-7d3e-4b50-a5da-722b2b5b97c2","Type":"ContainerStarted","Data":"6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.475336 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.477377 4777 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7kqgs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.477417 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" podUID="63641d03-7d3e-4b50-a5da-722b2b5b97c2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.478951 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" event={"ID":"5615afd6-9159-4ccc-b08d-4305b9b792bb","Type":"ContainerStarted","Data":"c119278ff6b463a65b1c1326026ea8d444401b3020a77cffddf7a9d385010fc1"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.484835 4777 generic.go:334] "Generic (PLEG): container finished" 
podID="8b1dfd22-6fa0-4096-97e4-871e6b65ad30" containerID="ae71e459ca46f3d61ba0b6fba10ecc9248d75a2f247ecf2c15787aa8557b7052" exitCode=0 Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.484895 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" event={"ID":"8b1dfd22-6fa0-4096-97e4-871e6b65ad30","Type":"ContainerDied","Data":"ae71e459ca46f3d61ba0b6fba10ecc9248d75a2f247ecf2c15787aa8557b7052"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.492326 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j4h22" event={"ID":"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6","Type":"ContainerStarted","Data":"47118df0f4396bf81aa451c2c2200b09aaac545dc9d62ac596969bc2fa17db0c"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.494939 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx" event={"ID":"be3225d4-c821-48fc-8f4b-bfa603232b90","Type":"ContainerStarted","Data":"e5ae99a0ce0ea6cbc480ad9ea6950914e82dbfd9daa0038e5a335282d4a2edc6"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.503680 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" event={"ID":"489bab1c-472b-4904-b4bb-3dc341f48f27","Type":"ContainerStarted","Data":"b073c509846648738402a44c849f451070d421d2165f1d784f19a6fbf14cba05"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.508262 4777 generic.go:334] "Generic (PLEG): container finished" podID="3153ee9e-fa94-4bb7-8506-16778984382b" containerID="9af3fb9cfe46247f60c0b5ae9060c6bf0931f4da94ba021f18131c01de15db43" exitCode=0 Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.508432 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" 
event={"ID":"3153ee9e-fa94-4bb7-8506-16778984382b","Type":"ContainerDied","Data":"9af3fb9cfe46247f60c0b5ae9060c6bf0931f4da94ba021f18131c01de15db43"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.514027 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kr6qs" event={"ID":"c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a","Type":"ContainerStarted","Data":"3189151f0637ed8a3af6a3a637cac09f1be1fa64edb516336f787a441dc93a9f"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.523266 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" event={"ID":"acf589f2-0447-4431-92cc-73956a345b44","Type":"ContainerStarted","Data":"5dba7c803ea67153cc734e188a62ee961d392e5c18f053cb2001a65a8ff98c2b"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.525479 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8" event={"ID":"d6810abc-8d3e-49d2-8a37-c52a22aee915","Type":"ContainerStarted","Data":"e5c9ab2a066ec85ee88a34c3369464bec7ef7a2533acdc4e27a4e94faa96be33"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.529988 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t" event={"ID":"0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346","Type":"ContainerStarted","Data":"84c3e9cb059b82d34fe52b660f0b6be2be117ed8508d8e858612a30bddf23086"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.533576 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" event={"ID":"da00f555-fb9c-4329-b263-5d1e3c31180e","Type":"ContainerStarted","Data":"dbb6e9f8e79ec2bfb21eeac664548204739508975c486da2afb8232cc4664806"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.538786 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn" 
event={"ID":"299577d6-9dd4-409c-bb21-c3fed59fd3bb","Type":"ContainerStarted","Data":"f024c9ce287dffdb57271546d662669cc541f0dc7ad5ca20bd70388e395b4666"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.542456 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" event={"ID":"b303b57a-e6e8-40da-8fa9-56e11e1a948b","Type":"ContainerStarted","Data":"cd5b23bee5d93402508650412940bf4e1d5749aeee71504e3a4dc777b2dab82e"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.544268 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd" event={"ID":"e8953b31-9a7b-4291-8ef8-02e3eb34f0d8","Type":"ContainerStarted","Data":"86e4d187fa19f80e823c3147cb0f5f1bef72dba31ae5b665c8e873c7fad22f1e"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.546016 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" event={"ID":"fc7762e5-2274-4c99-8d8c-0fb23340150f","Type":"ContainerStarted","Data":"46a9bdcbc65b5e3b249f67d2b481d6168376ffbeb75dc0763ce7429c09097bc9"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.547332 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.549727 4777 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-5pdvt container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.19:6443/healthz\": dial tcp 10.217.0.19:6443: connect: connection refused" start-of-body= Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.549797 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" podUID="fc7762e5-2274-4c99-8d8c-0fb23340150f" 
containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.19:6443/healthz\": dial tcp 10.217.0.19:6443: connect: connection refused" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.550850 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb" event={"ID":"21b9331b-c0a6-4a19-a056-93eb683693df","Type":"ContainerStarted","Data":"1678fc42ffef5e779697281c150af24689a8d0a8ce32ce7be7c01661c9cb76ce"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.550953 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb" event={"ID":"21b9331b-c0a6-4a19-a056-93eb683693df","Type":"ContainerStarted","Data":"7f3f955c9bdf203b41696f8ace7715e3b303bc2d54572f7c978d018a69397fe2"} Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.555876 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-8jfdb" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.556601 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.558183 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:35.058165337 +0000 UTC m=+155.640666439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.662535 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.663425 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:35.163391289 +0000 UTC m=+155.745892401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.664679 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.670533 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:35.17050602 +0000 UTC m=+155.753007302 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.797190 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tgrqf" podStartSLOduration=133.797167997 podStartE2EDuration="2m13.797167997s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:34.792902793 +0000 UTC m=+155.375403895" watchObservedRunningTime="2026-02-16 21:40:34.797167997 +0000 UTC m=+155.379669099" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.797330 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" podStartSLOduration=133.797323522 podStartE2EDuration="2m13.797323522s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:34.741740031 +0000 UTC m=+155.324241133" watchObservedRunningTime="2026-02-16 21:40:34.797323522 +0000 UTC m=+155.379824624" Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.802801 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-16 21:40:35.302774577 +0000 UTC m=+155.885275669 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.802853 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.803404 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.803969 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:35.303956837 +0000 UTC m=+155.886457939 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.818252 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7fgrb" podStartSLOduration=132.81823327 podStartE2EDuration="2m12.81823327s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:34.816993718 +0000 UTC m=+155.399494830" watchObservedRunningTime="2026-02-16 21:40:34.81823327 +0000 UTC m=+155.400734372" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.857604 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4kb9f" podStartSLOduration=133.857584002 podStartE2EDuration="2m13.857584002s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:34.857386695 +0000 UTC m=+155.439887797" watchObservedRunningTime="2026-02-16 21:40:34.857584002 +0000 UTC m=+155.440085104" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.903518 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-8jfdb" podStartSLOduration=133.903500586 podStartE2EDuration="2m13.903500586s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:34.901029072 +0000 UTC m=+155.483530174" watchObservedRunningTime="2026-02-16 21:40:34.903500586 +0000 UTC m=+155.486001688" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.904905 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:34 crc kubenswrapper[4777]: E0216 21:40:34.905182 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:35.405167922 +0000 UTC m=+155.987669024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.907379 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.934959 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-16 21:35:33 +0000 UTC, rotation deadline is 2026-12-02 18:12:25.92278858 +0000 UTC Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.935094 4777 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6932h31m50.987697875s for next certificate rotation Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.940189 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" podStartSLOduration=133.940164357 podStartE2EDuration="2m13.940164357s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:34.938836552 +0000 UTC m=+155.521337654" watchObservedRunningTime="2026-02-16 21:40:34.940164357 +0000 UTC m=+155.522665459" Feb 16 21:40:34 crc kubenswrapper[4777]: I0216 21:40:34.987225 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-l29zb" podStartSLOduration=133.987198069 podStartE2EDuration="2m13.987198069s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:34.984674713 +0000 UTC m=+155.567175815" watchObservedRunningTime="2026-02-16 21:40:34.987198069 +0000 UTC m=+155.569699171" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.006398 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:35 crc kubenswrapper[4777]: E0216 21:40:35.007068 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:35.507052561 +0000 UTC m=+156.089553663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.022910 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-phc5s" podStartSLOduration=134.022890907 podStartE2EDuration="2m14.022890907s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:35.021048655 +0000 UTC m=+155.603549757" watchObservedRunningTime="2026-02-16 21:40:35.022890907 +0000 UTC m=+155.605392009" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.103014 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tcrg" podStartSLOduration=134.102988168 podStartE2EDuration="2m14.102988168s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:35.062165556 +0000 UTC m=+155.644666658" watchObservedRunningTime="2026-02-16 21:40:35.102988168 +0000 UTC m=+155.685489270" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.110761 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:35 crc kubenswrapper[4777]: E0216 21:40:35.110969 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:35.610942127 +0000 UTC m=+156.193443229 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.111053 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:35 crc kubenswrapper[4777]: E0216 21:40:35.111420 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:35.611403693 +0000 UTC m=+156.193904785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.141985 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-69r9h" podStartSLOduration=134.141961996 podStartE2EDuration="2m14.141961996s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:35.105483403 +0000 UTC m=+155.687984505" watchObservedRunningTime="2026-02-16 21:40:35.141961996 +0000 UTC m=+155.724463098" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.144683 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-fb8q7" podStartSLOduration=134.144667868 podStartE2EDuration="2m14.144667868s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:35.139898286 +0000 UTC m=+155.722399388" watchObservedRunningTime="2026-02-16 21:40:35.144667868 +0000 UTC m=+155.727168970" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.213979 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:35 crc kubenswrapper[4777]: E0216 21:40:35.214427 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:35.714400728 +0000 UTC m=+156.296901830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.228858 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" podStartSLOduration=134.228838617 podStartE2EDuration="2m14.228838617s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:35.226981104 +0000 UTC m=+155.809482206" watchObservedRunningTime="2026-02-16 21:40:35.228838617 +0000 UTC m=+155.811339709" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.249439 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:35 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:35 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:35 
crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.249949 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.272556 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rxnqn" podStartSLOduration=134.272533766 podStartE2EDuration="2m14.272533766s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:35.271645406 +0000 UTC m=+155.854146508" watchObservedRunningTime="2026-02-16 21:40:35.272533766 +0000 UTC m=+155.855034868" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.319858 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:35 crc kubenswrapper[4777]: E0216 21:40:35.320238 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:35.82022605 +0000 UTC m=+156.402727152 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.422379 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:35 crc kubenswrapper[4777]: E0216 21:40:35.422918 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:35.922890615 +0000 UTC m=+156.505391717 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.427128 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" podStartSLOduration=133.427106968 podStartE2EDuration="2m13.427106968s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:35.340998803 +0000 UTC m=+155.923499915" watchObservedRunningTime="2026-02-16 21:40:35.427106968 +0000 UTC m=+156.009608070" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.525177 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:35 crc kubenswrapper[4777]: E0216 21:40:35.525679 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:36.025659463 +0000 UTC m=+156.608160565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.630307 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:35 crc kubenswrapper[4777]: E0216 21:40:35.631059 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:36.1310372 +0000 UTC m=+156.713538302 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.655455 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" event={"ID":"489bab1c-472b-4904-b4bb-3dc341f48f27","Type":"ContainerStarted","Data":"c034a63d1e6274f25ae63c8bfd487080099b9c9e19608c54832ea50ebec492ef"} Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.678115 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8" event={"ID":"d6810abc-8d3e-49d2-8a37-c52a22aee915","Type":"ContainerStarted","Data":"30fded883d914a958e827a01f33c79cdf8f61d5e3086d0e6024eb144bc949843"} Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.732037 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:35 crc kubenswrapper[4777]: E0216 21:40:35.733409 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:36.233397345 +0000 UTC m=+156.815898447 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.761762 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" event={"ID":"6845600c-842e-4274-86af-d8dc3ae0beb7","Type":"ContainerStarted","Data":"4240f200824c4cf9a6457fb38a9c318599c07020397388faaad08c61af82a889"} Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.762331 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.769690 4777 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x8jn4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.769769 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" podUID="6845600c-842e-4274-86af-d8dc3ae0beb7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.784610 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kr6qs" 
event={"ID":"c6d7882e-80c4-45b9-aa97-c2dbe0bcbe8a","Type":"ContainerStarted","Data":"7bf0de763b3c30970c4b10bb22a44860c245ec893a9db06e543193fcaf008676"} Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.821419 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn" event={"ID":"299577d6-9dd4-409c-bb21-c3fed59fd3bb","Type":"ContainerStarted","Data":"30c822d94f6f0529a2c3990f97ba5dcf0530509b85ed8a3d7074125b13f0928a"} Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.836658 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:35 crc kubenswrapper[4777]: E0216 21:40:35.837272 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:36.337238149 +0000 UTC m=+156.919739251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.854545 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" event={"ID":"b303b57a-e6e8-40da-8fa9-56e11e1a948b","Type":"ContainerStarted","Data":"215ddcd9ea0ae7cffab758c8815d21e91f85e13d57db43598bd782867c84040b"} Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.857167 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" podStartSLOduration=133.857154774 podStartE2EDuration="2m13.857154774s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:35.85587937 +0000 UTC m=+156.438380472" watchObservedRunningTime="2026-02-16 21:40:35.857154774 +0000 UTC m=+156.439655876" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.857396 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" podStartSLOduration=134.857385621 podStartE2EDuration="2m14.857385621s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:35.702361744 +0000 UTC m=+156.284862846" watchObservedRunningTime="2026-02-16 21:40:35.857385621 +0000 UTC 
m=+156.439886723" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.882076 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" event={"ID":"6c4f6c2a-8c1c-4f48-b62a-40f1ed3a9d0a","Type":"ContainerStarted","Data":"1b44309e664bd41d70a4dcae6c3d84878ad156f72440f8f5faceaa74738d5508"} Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.908999 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz" event={"ID":"1233885f-29aa-46d2-8063-b88df7132010","Type":"ContainerStarted","Data":"ea5fbe0c7a4a7a65b8e4b9976c68977962b65199c11d567ec0609e4492622225"} Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.910245 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.929343 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kr6qs" podStartSLOduration=5.929319296 podStartE2EDuration="5.929319296s" podCreationTimestamp="2026-02-16 21:40:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:35.927384851 +0000 UTC m=+156.509885953" watchObservedRunningTime="2026-02-16 21:40:35.929319296 +0000 UTC m=+156.511820398" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.934836 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9skvx" event={"ID":"3751ea5e-c531-40db-a0ac-b4110a0f4aed","Type":"ContainerStarted","Data":"1c152e3a9b1b0b3487b308eb2900727eff2170e219c4c61912f64d759ecab808"} Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.936461 4777 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xrkzz container/olm-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.936574 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz" podUID="1233885f-29aa-46d2-8063-b88df7132010" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.938247 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:35 crc kubenswrapper[4777]: E0216 21:40:35.944555 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:36.444537171 +0000 UTC m=+157.027038273 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.977479 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w996l" podStartSLOduration=134.977455805 podStartE2EDuration="2m14.977455805s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:35.96075779 +0000 UTC m=+156.543258882" watchObservedRunningTime="2026-02-16 21:40:35.977455805 +0000 UTC m=+156.559956917" Feb 16 21:40:35 crc kubenswrapper[4777]: I0216 21:40:35.985133 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" event={"ID":"efb9114f-122a-4b76-8763-4dd9dd908707","Type":"ContainerStarted","Data":"16b1a3dd7b8f6a881285a0350647458ab974ea8634c8a34434b7bc09d8778f11"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.016849 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" event={"ID":"5615afd6-9159-4ccc-b08d-4305b9b792bb","Type":"ContainerStarted","Data":"46bac7483b07fd596fa1d3e524cf32771f8c2f60862dc3f126946f71cc6ed508"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.039933 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.041251 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:36.541226324 +0000 UTC m=+157.123727426 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.058394 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-bcpwn" podStartSLOduration=134.058373144 podStartE2EDuration="2m14.058373144s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.014318323 +0000 UTC m=+156.596819425" watchObservedRunningTime="2026-02-16 21:40:36.058373144 +0000 UTC m=+156.640874236" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.059113 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" podStartSLOduration=134.059107889 podStartE2EDuration="2m14.059107889s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.05824789 +0000 UTC m=+156.640748992" watchObservedRunningTime="2026-02-16 21:40:36.059107889 +0000 UTC m=+156.641608991" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.067852 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r" event={"ID":"d8b16579-167c-42bb-81f1-e688bff397b3","Type":"ContainerStarted","Data":"095e92fdd85708774305371171b7931ec0fd3bd31b645cee8e42c98e4aa7c5b6"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.087661 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz" podStartSLOduration=134.087634385 podStartE2EDuration="2m14.087634385s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.087270002 +0000 UTC m=+156.669771104" watchObservedRunningTime="2026-02-16 21:40:36.087634385 +0000 UTC m=+156.670135487" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.097563 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd" event={"ID":"e8953b31-9a7b-4291-8ef8-02e3eb34f0d8","Type":"ContainerStarted","Data":"f6f97b7b2d5b75334e4439c0e0e319f820fa1761a3bdb8a62e8dc249940d5abf"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.098666 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.114673 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6kc6m" 
podStartSLOduration=134.114652129 podStartE2EDuration="2m14.114652129s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.11259722 +0000 UTC m=+156.695098322" watchObservedRunningTime="2026-02-16 21:40:36.114652129 +0000 UTC m=+156.697153231" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.115643 4777 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4xwfd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.115680 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd" podUID="e8953b31-9a7b-4291-8ef8-02e3eb34f0d8" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.117921 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.151487 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.155171 4777 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:36.65515347 +0000 UTC m=+157.237654572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.159561 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" event={"ID":"acf589f2-0447-4431-92cc-73956a345b44","Type":"ContainerStarted","Data":"d04996da245f1a942dff21e11727be6dd4105be864af59a55c7bc6ef30077764"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.160970 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.168763 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" event={"ID":"4eadd010-ae18-4453-8810-1f8edf434cb3","Type":"ContainerStarted","Data":"b652e68a557e6bf036b7d7c10e707aaeac80c326803214742d609c3164d2a186"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.171098 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9skvx" podStartSLOduration=7.171085769 podStartE2EDuration="7.171085769s" podCreationTimestamp="2026-02-16 21:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-16 21:40:36.168291245 +0000 UTC m=+156.750792347" watchObservedRunningTime="2026-02-16 21:40:36.171085769 +0000 UTC m=+156.753586871" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.184565 4777 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dfdd6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.184624 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" podUID="acf589f2-0447-4431-92cc-73956a345b44" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.195987 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" event={"ID":"90968a16-faa8-4ec7-a565-4d3cb75ed1dd","Type":"ContainerStarted","Data":"c60f1b59fef113d6deedadc25e87bb2fc72fae4e90b31aa79e67cfc6709dffb2"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.198576 4777 generic.go:334] "Generic (PLEG): container finished" podID="365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7" containerID="dc45cadf73c9abf1172185335a0a958fe5a5d4f4eb63c550375f5209aeea4500" exitCode=0 Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.198747 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" event={"ID":"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7","Type":"ContainerDied","Data":"dc45cadf73c9abf1172185335a0a958fe5a5d4f4eb63c550375f5209aeea4500"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.234537 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z" event={"ID":"93605ec1-16ef-4ffc-aa50-8096ea8e8acc","Type":"ContainerStarted","Data":"632e281ff57fcdafdcbe35befa7ee546365b5ef01d30cd32c6ba96d11aa4cd58"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.236667 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-r5cl6" podStartSLOduration=134.236641118 podStartE2EDuration="2m14.236641118s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.19532839 +0000 UTC m=+156.777829492" watchObservedRunningTime="2026-02-16 21:40:36.236641118 +0000 UTC m=+156.819142230" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.236905 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" podStartSLOduration=135.236900267 podStartE2EDuration="2m15.236900267s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.230100077 +0000 UTC m=+156.812601179" watchObservedRunningTime="2026-02-16 21:40:36.236900267 +0000 UTC m=+156.819401379" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.257003 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.257361 4777 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:36.757342019 +0000 UTC m=+157.339843111 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.258170 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.258270 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-j5x2r" podStartSLOduration=134.25825963 podStartE2EDuration="2m14.25825963s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.256283473 +0000 UTC m=+156.838784605" watchObservedRunningTime="2026-02-16 21:40:36.25825963 +0000 UTC m=+156.840760732" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.258697 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:36 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:36 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:36 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.258787 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.261842 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx" event={"ID":"be3225d4-c821-48fc-8f4b-bfa603232b90","Type":"ContainerStarted","Data":"a1fa82be1944f71dc730f64cd226ca917a0690cbce6552d90a1573eece7d5b6e"} Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.261856 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:36.761845641 +0000 UTC m=+157.344346743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.274966 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lzg58" event={"ID":"263c0239-456c-4076-ae02-eb2a4abc33e4","Type":"ContainerStarted","Data":"ea893461b3395e661cb6e15de9eb4cad6de3b0ffcd7e774f8b73693ae17fdc7c"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.275775 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.325062 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt" podStartSLOduration=134.32503918 podStartE2EDuration="2m14.32503918s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.281163075 +0000 UTC m=+156.863664177" watchObservedRunningTime="2026-02-16 21:40:36.32503918 +0000 UTC m=+156.907540282" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.325386 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd" podStartSLOduration=134.325381442 podStartE2EDuration="2m14.325381442s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-16 21:40:36.325373321 +0000 UTC m=+156.907874423" watchObservedRunningTime="2026-02-16 21:40:36.325381442 +0000 UTC m=+156.907882544" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.332929 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" event={"ID":"027a50f3-fb4e-491e-b410-9d2ed7b4b836","Type":"ContainerStarted","Data":"59e73793a0337c4ce92ec4a07b6bebc30fab6a8d706aa69366c9fd39ed7a82db"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.349562 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" podStartSLOduration=134.349540789 podStartE2EDuration="2m14.349540789s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.348331459 +0000 UTC m=+156.930832561" watchObservedRunningTime="2026-02-16 21:40:36.349540789 +0000 UTC m=+156.932041891" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.354782 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t" event={"ID":"0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346","Type":"ContainerStarted","Data":"239e7ffd1a4cca2be8ef32f2c58ede8ca719c7cc689a8828ae7c7d37b1b3106d"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.361454 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.361638 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.361725 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:36.861691581 +0000 UTC m=+157.444192683 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.362648 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.366174 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:36.866159412 +0000 UTC m=+157.448660514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.390567 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" event={"ID":"4af6ea3c-7da8-465d-b056-9b9f0a67f626","Type":"ContainerStarted","Data":"21bb509dd2e2ce0b0f1bec266204c4656b78316f440c5aeeb2dce9774f942ab1"} Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.391580 4777 patch_prober.go:28] interesting pod/downloads-7954f5f757-phc5s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.391754 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-phc5s" podUID="29e22b2d-4f29-4258-8b81-31adeae29a3c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.402246 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.410576 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.429007 
4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" podStartSLOduration=135.428983118 podStartE2EDuration="2m15.428983118s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.386311104 +0000 UTC m=+156.968812206" watchObservedRunningTime="2026-02-16 21:40:36.428983118 +0000 UTC m=+157.011484220" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.432178 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z" podStartSLOduration=134.432168016 podStartE2EDuration="2m14.432168016s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.428695969 +0000 UTC m=+157.011197071" watchObservedRunningTime="2026-02-16 21:40:36.432168016 +0000 UTC m=+157.014669148" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.437949 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.465636 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.488951 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-16 21:40:36.988926837 +0000 UTC m=+157.571427939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.529697 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4zqxx" podStartSLOduration=135.529675726 podStartE2EDuration="2m15.529675726s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.529673636 +0000 UTC m=+157.112174758" watchObservedRunningTime="2026-02-16 21:40:36.529675726 +0000 UTC m=+157.112176818" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.532268 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lzg58" podStartSLOduration=7.532257574 podStartE2EDuration="7.532257574s" podCreationTimestamp="2026-02-16 21:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.479216179 +0000 UTC m=+157.061717281" watchObservedRunningTime="2026-02-16 21:40:36.532257574 +0000 UTC m=+157.114758676" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.568439 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.568862 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.068846632 +0000 UTC m=+157.651347734 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.615066 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t" podStartSLOduration=134.615044076 podStartE2EDuration="2m14.615044076s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.613355689 +0000 UTC m=+157.195856791" watchObservedRunningTime="2026-02-16 21:40:36.615044076 +0000 UTC m=+157.197545178" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.669294 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.669696 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.169676475 +0000 UTC m=+157.752177577 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.725207 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n6q9g" podStartSLOduration=135.725184294 podStartE2EDuration="2m15.725184294s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.683032267 +0000 UTC m=+157.265533369" watchObservedRunningTime="2026-02-16 21:40:36.725184294 +0000 UTC m=+157.307685386" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.760410 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" podStartSLOduration=135.760387605 podStartE2EDuration="2m15.760387605s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-16 21:40:36.725110991 +0000 UTC m=+157.307612103" watchObservedRunningTime="2026-02-16 21:40:36.760387605 +0000 UTC m=+157.342888707" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.770578 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.770943 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.270927052 +0000 UTC m=+157.853428154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.872280 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.872884 4777 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.372856522 +0000 UTC m=+157.955357624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.872992 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.873390 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.3733812 +0000 UTC m=+157.955882302 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.884886 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" podStartSLOduration=134.884865889 podStartE2EDuration="2m14.884865889s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:36.873180163 +0000 UTC m=+157.455681265" watchObservedRunningTime="2026-02-16 21:40:36.884865889 +0000 UTC m=+157.467366991" Feb 16 21:40:36 crc kubenswrapper[4777]: I0216 21:40:36.977604 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:36 crc kubenswrapper[4777]: E0216 21:40:36.978069 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.478052173 +0000 UTC m=+158.060553275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.078845 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.079197 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.579183676 +0000 UTC m=+158.161684778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.179803 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.180147 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.680126472 +0000 UTC m=+158.262627564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.243359 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:37 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:37 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:37 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.243423 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.281540 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.281886 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-16 21:40:37.781871136 +0000 UTC m=+158.364372238 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.382752 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.383076 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.88303284 +0000 UTC m=+158.465533942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.383328 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.383683 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.883667932 +0000 UTC m=+158.466169034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.398749 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" event={"ID":"3153ee9e-fa94-4bb7-8506-16778984382b","Type":"ContainerStarted","Data":"5f5f050889337b94b7aba0c3f0e894efacad6c51863d7128a023283bdba61b6a"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.400562 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" event={"ID":"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7","Type":"ContainerStarted","Data":"cf64f2dcbb9a1b46459beb46f844df64bdb2e5b9ef54a2b93e2f0fe0edd599f0"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.400586 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" event={"ID":"365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7","Type":"ContainerStarted","Data":"abb201d097ca62918c222175ef9bd3f0331c218b20921b1244d8c78b71c22f5b"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.416269 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t" event={"ID":"0cf641fa-c2ae-4d0a-b7c1-b9f20fa7f346","Type":"ContainerStarted","Data":"92683db09c514ea83ad6e1c2311386d638d22eadbbaba5cb0297d15ac0adbf67"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.428449 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j4h22" 
event={"ID":"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6","Type":"ContainerStarted","Data":"caca724bc76096e468bd0f26eefdd4710487dac3d5e611757b60608212ae40f7"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.438116 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bwxqt" event={"ID":"9fee359b-721a-4ae4-96fd-1edd2b309712","Type":"ContainerStarted","Data":"1c94e6521442559fa3ab3fe256bed0482791a0d29281c444405b4408177db1a0"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.440871 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lzg58" event={"ID":"263c0239-456c-4076-ae02-eb2a4abc33e4","Type":"ContainerStarted","Data":"419f4bd6cff1e54c3c5e9da8d78180caf32e6337b58e78713d847aec569afe97"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.444014 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c9g4b" event={"ID":"efb9114f-122a-4b76-8763-4dd9dd908707","Type":"ContainerStarted","Data":"b41d7f7a697c3017d5a9723cb55bdeca0811f19dcebbc23f6cec1ca65337903f"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.448468 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tqjpt" event={"ID":"4af6ea3c-7da8-465d-b056-9b9f0a67f626","Type":"ContainerStarted","Data":"e6ae801f67d91701f3da7bddd9a09c0e548823269d524d9368b52baa6bd7c234"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.452648 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" podStartSLOduration=135.452626166 podStartE2EDuration="2m15.452626166s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:37.450557456 +0000 UTC 
m=+158.033058558" watchObservedRunningTime="2026-02-16 21:40:37.452626166 +0000 UTC m=+158.035127268" Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.453756 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pnrgl" event={"ID":"90968a16-faa8-4ec7-a565-4d3cb75ed1dd","Type":"ContainerStarted","Data":"22b4a408316933432ed1536bddc921c983035cd7f6ce2275a2621b61a981f583"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.458319 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" event={"ID":"8b1dfd22-6fa0-4096-97e4-871e6b65ad30","Type":"ContainerStarted","Data":"5716bb153257d11a420a2fe1110489053c97c36043f55add311e6f27670cda65"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.460491 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-87dv2" event={"ID":"da00f555-fb9c-4329-b263-5d1e3c31180e","Type":"ContainerStarted","Data":"4c90cc25bf56524caca5414726d7894d177e6fa8f07be78c56237ff4f8b386e8"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.466970 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8" event={"ID":"d6810abc-8d3e-49d2-8a37-c52a22aee915","Type":"ContainerStarted","Data":"9aa0762ed5ccc82423de8f6cc26fb21cae5dc079702d751bc62565465b9498b2"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.469087 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg47z" event={"ID":"93605ec1-16ef-4ffc-aa50-8096ea8e8acc","Type":"ContainerStarted","Data":"6ee064ff9fca851859c629f886298e642bb7e2b56f5bc1a49b45a8341c274a24"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.471601 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wr74s" 
event={"ID":"489bab1c-472b-4904-b4bb-3dc341f48f27","Type":"ContainerStarted","Data":"54cfea3a9009cd5a024034439d805f04481a0672f32ce95f6723fe788474c013"} Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.482223 4777 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dfdd6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.482297 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" podUID="acf589f2-0447-4431-92cc-73956a345b44" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.482826 4777 patch_prober.go:28] interesting pod/downloads-7954f5f757-phc5s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.483020 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-phc5s" podUID="29e22b2d-4f29-4258-8b81-31adeae29a3c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.484981 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.485097 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.985075284 +0000 UTC m=+158.567576386 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.486475 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.487003 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:37.986993919 +0000 UTC m=+158.569495121 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.505400 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xrkzz" Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.510361 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.510761 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.512448 4777 patch_prober.go:28] interesting pod/apiserver-76f77b778f-kgdkp container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.512502 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" podUID="365ca9d0-1c4e-43ab-8eaa-0db45cf02fb7" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.540397 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4xwfd" Feb 16 21:40:37 crc 
kubenswrapper[4777]: I0216 21:40:37.587913 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.588145 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:38.088113412 +0000 UTC m=+158.670614514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.591084 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.593099 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" podStartSLOduration=136.59307574 podStartE2EDuration="2m16.59307574s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:37.530411579 +0000 UTC m=+158.112912681" watchObservedRunningTime="2026-02-16 21:40:37.59307574 +0000 UTC m=+158.175576842" Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.593347 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-kpqz8" podStartSLOduration=135.593340209 podStartE2EDuration="2m15.593340209s" podCreationTimestamp="2026-02-16 21:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:37.572462522 +0000 UTC m=+158.154963624" watchObservedRunningTime="2026-02-16 21:40:37.593340209 +0000 UTC m=+158.175841301" Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.610358 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:38.110331194 +0000 UTC m=+158.692832296 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.693568 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.693966 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:38.193944954 +0000 UTC m=+158.776446056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.795615 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.796126 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:38.296102842 +0000 UTC m=+158.878603944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:37 crc kubenswrapper[4777]: I0216 21:40:37.896402 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:37 crc kubenswrapper[4777]: E0216 21:40:37.896799 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:38.396772359 +0000 UTC m=+158.979273451 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:37.999403 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:37.999963 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:38.499948701 +0000 UTC m=+159.082449803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.101974 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.102451 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:38.60243043 +0000 UTC m=+159.184931532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.170949 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x8jn4" Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.203226 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.203677 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:38.703654446 +0000 UTC m=+159.286155548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.241984 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:38 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:38 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:38 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.242051 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.304032 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.304244 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-16 21:40:38.804207419 +0000 UTC m=+159.386708521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.304462 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.304857 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:38.804841551 +0000 UTC m=+159.387342653 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.378642 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6wbqg" Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.406040 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.406254 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:38.906219342 +0000 UTC m=+159.488720444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.406354 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.406694 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:38.906677848 +0000 UTC m=+159.489178950 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.503257 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j4h22" event={"ID":"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6","Type":"ContainerStarted","Data":"b6c1c44bd9f54415f73680b4d7fa8e05d24f61d3d09fb07396c9cab8c0897fef"}
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.503323 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j4h22" event={"ID":"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6","Type":"ContainerStarted","Data":"cdffaa2b26c0bdae6e774c353bbd4df399642f576fcdba3ff3d035f9b43fc1b4"}
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.507643 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.507881 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.007848462 +0000 UTC m=+159.590349564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.508194 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.508736 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.008725492 +0000 UTC m=+159.591226594 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.519243 4777 generic.go:334] "Generic (PLEG): container finished" podID="4eadd010-ae18-4453-8810-1f8edf434cb3" containerID="b652e68a557e6bf036b7d7c10e707aaeac80c326803214742d609c3164d2a186" exitCode=0
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.519326 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" event={"ID":"4eadd010-ae18-4453-8810-1f8edf434cb3","Type":"ContainerDied","Data":"b652e68a557e6bf036b7d7c10e707aaeac80c326803214742d609c3164d2a186"}
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.520937 4777 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dfdd6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.521004 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" podUID="acf589f2-0447-4431-92cc-73956a345b44" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.609259 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.609529 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.109492243 +0000 UTC m=+159.691993345 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.610134 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.611043 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.111025524 +0000 UTC m=+159.693526626 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.711503 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.711753 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.211703702 +0000 UTC m=+159.794204794 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.711789 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.712190 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.212182798 +0000 UTC m=+159.794683970 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.813183 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.813578 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.313563469 +0000 UTC m=+159.896064571 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.849203 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wzxjq"]
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.850272 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.855306 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.865137 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzxjq"]
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.914484 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.914611 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-catalog-content\") pod \"community-operators-wzxjq\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.914639 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-utilities\") pod \"community-operators-wzxjq\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:40:38 crc kubenswrapper[4777]: I0216 21:40:38.914766 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25tx2\" (UniqueName: \"kubernetes.io/projected/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-kube-api-access-25tx2\") pod \"community-operators-wzxjq\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:40:38 crc kubenswrapper[4777]: E0216 21:40:38.914958 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.41493717 +0000 UTC m=+159.997438272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.016459 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.016789 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25tx2\" (UniqueName: \"kubernetes.io/projected/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-kube-api-access-25tx2\") pod \"community-operators-wzxjq\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.016872 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-catalog-content\") pod \"community-operators-wzxjq\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.016901 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-utilities\") pod \"community-operators-wzxjq\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.017382 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-utilities\") pod \"community-operators-wzxjq\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:40:39 crc kubenswrapper[4777]: E0216 21:40:39.017461 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.5174466 +0000 UTC m=+160.099947702 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.017980 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-catalog-content\") pod \"community-operators-wzxjq\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.028972 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nrsdv"]
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.033824 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.049539 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.063209 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nrsdv"]
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.075605 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25tx2\" (UniqueName: \"kubernetes.io/projected/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-kube-api-access-25tx2\") pod \"community-operators-wzxjq\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.118766 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-utilities\") pod \"certified-operators-nrsdv\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.118827 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-catalog-content\") pod \"certified-operators-nrsdv\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.118870 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lw8n\" (UniqueName: \"kubernetes.io/projected/93764402-52dc-48ce-9352-bd7982f7f0d6-kube-api-access-5lw8n\") pod \"certified-operators-nrsdv\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.118899 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:39 crc kubenswrapper[4777]: E0216 21:40:39.119266 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.619253096 +0000 UTC m=+160.201754388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.167137 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.219773 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:40:39 crc kubenswrapper[4777]: E0216 21:40:39.219928 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.719894022 +0000 UTC m=+160.302395124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.220452 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-utilities\") pod \"certified-operators-nrsdv\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.220488 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-catalog-content\") pod \"certified-operators-nrsdv\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.220530 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.220550 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lw8n\" (UniqueName: \"kubernetes.io/projected/93764402-52dc-48ce-9352-bd7982f7f0d6-kube-api-access-5lw8n\") pod \"certified-operators-nrsdv\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.221143 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-catalog-content\") pod \"certified-operators-nrsdv\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.221308 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-utilities\") pod \"certified-operators-nrsdv\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:40:39 crc kubenswrapper[4777]: E0216 21:40:39.221766 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.721749125 +0000 UTC m=+160.304250227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.235074 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kl64h"]
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.236449 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.250232 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kl64h"]
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.253368 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 16 21:40:39 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld
Feb 16 21:40:39 crc kubenswrapper[4777]: [+]process-running ok
Feb 16 21:40:39 crc kubenswrapper[4777]: healthz check failed
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.253582 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.254790 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lw8n\" (UniqueName: \"kubernetes.io/projected/93764402-52dc-48ce-9352-bd7982f7f0d6-kube-api-access-5lw8n\") pod \"certified-operators-nrsdv\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.321211 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.321414 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfpk6\" (UniqueName: \"kubernetes.io/projected/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-kube-api-access-mfpk6\") pod \"community-operators-kl64h\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") " pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.321467 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-utilities\") pod \"community-operators-kl64h\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") " pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.321505 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-catalog-content\") pod \"community-operators-kl64h\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") " pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:40:39 crc kubenswrapper[4777]: E0216 21:40:39.321646 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.821625205 +0000 UTC m=+160.404126307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.351550 4777 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.374081 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.423044 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfpk6\" (UniqueName: \"kubernetes.io/projected/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-kube-api-access-mfpk6\") pod \"community-operators-kl64h\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") " pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.423091 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.423122 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-utilities\") pod \"community-operators-kl64h\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") " pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.423162 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-catalog-content\") pod \"community-operators-kl64h\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") " pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.423649 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-catalog-content\") pod \"community-operators-kl64h\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") " pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:40:39 crc kubenswrapper[4777]: E0216 21:40:39.424288 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-16 21:40:39.92427452 +0000 UTC m=+160.506775622 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gdkjm" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.424626 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-utilities\") pod \"community-operators-kl64h\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") " pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.447924 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9wtvs"]
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.448966 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wtvs"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.479212 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9wtvs"]
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.485572 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfpk6\" (UniqueName: \"kubernetes.io/projected/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-kube-api-access-mfpk6\") pod \"community-operators-kl64h\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") " pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.524292 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.524501 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgbvk\" (UniqueName: \"kubernetes.io/projected/950a7269-d4a8-4d38-b02f-c41037ef0341-kube-api-access-bgbvk\") pod \"certified-operators-9wtvs\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") " pod="openshift-marketplace/certified-operators-9wtvs"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.524554 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-utilities\") pod \"certified-operators-9wtvs\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") " pod="openshift-marketplace/certified-operators-9wtvs"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.524584 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-catalog-content\") pod \"certified-operators-9wtvs\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") " pod="openshift-marketplace/certified-operators-9wtvs"
Feb 16 21:40:39 crc kubenswrapper[4777]: E0216 21:40:39.524736 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-16 21:40:40.024700529 +0000 UTC m=+160.607201631 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.531690 4777 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-16T21:40:39.351588589Z","Handler":null,"Name":""}
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.535500 4777 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.535540 4777 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.560233 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.571097 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j4h22" event={"ID":"b9b2ef41-7897-46c3-bfe5-5eddf6aba7c6","Type":"ContainerStarted","Data":"b9cb661fd0bd75ef1c6802fef40ee9196a5e48fa7e1d803912ec373dd537d794"}
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.610589 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-j4h22" podStartSLOduration=10.610566045 podStartE2EDuration="10.610566045s" podCreationTimestamp="2026-02-16 21:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:39.610202983 +0000 UTC m=+160.192704075" watchObservedRunningTime="2026-02-16 21:40:39.610566045 +0000 UTC m=+160.193067137"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.627115 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgbvk\" (UniqueName: \"kubernetes.io/projected/950a7269-d4a8-4d38-b02f-c41037ef0341-kube-api-access-bgbvk\") pod \"certified-operators-9wtvs\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") " pod="openshift-marketplace/certified-operators-9wtvs"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.627198 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.627977 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-utilities\") pod \"certified-operators-9wtvs\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") " pod="openshift-marketplace/certified-operators-9wtvs" Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.628084 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-catalog-content\") pod \"certified-operators-9wtvs\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") " pod="openshift-marketplace/certified-operators-9wtvs" Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.631646 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-catalog-content\") pod \"certified-operators-9wtvs\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") " pod="openshift-marketplace/certified-operators-9wtvs" Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.631652 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-utilities\") pod \"certified-operators-9wtvs\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") " pod="openshift-marketplace/certified-operators-9wtvs" Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.651532 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.651601 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.664260 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgbvk\" (UniqueName: \"kubernetes.io/projected/950a7269-d4a8-4d38-b02f-c41037ef0341-kube-api-access-bgbvk\") pod \"certified-operators-9wtvs\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") " pod="openshift-marketplace/certified-operators-9wtvs" Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.689843 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzxjq"] Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.733021 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gdkjm\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") " pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.798931 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9wtvs" Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.834312 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.860056 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.874387 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nrsdv"] Feb 16 21:40:39 crc kubenswrapper[4777]: I0216 21:40:39.957180 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.117356 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.141327 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 16 21:40:40 crc kubenswrapper[4777]: E0216 21:40:40.141644 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eadd010-ae18-4453-8810-1f8edf434cb3" containerName="collect-profiles" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.141658 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eadd010-ae18-4453-8810-1f8edf434cb3" containerName="collect-profiles" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.141768 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eadd010-ae18-4453-8810-1f8edf434cb3" containerName="collect-profiles" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.144786 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.153088 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.153171 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.153409 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.165871 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eadd010-ae18-4453-8810-1f8edf434cb3-config-volume\") pod \"4eadd010-ae18-4453-8810-1f8edf434cb3\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " 
Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.165914 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkvkj\" (UniqueName: \"kubernetes.io/projected/4eadd010-ae18-4453-8810-1f8edf434cb3-kube-api-access-qkvkj\") pod \"4eadd010-ae18-4453-8810-1f8edf434cb3\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.166023 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eadd010-ae18-4453-8810-1f8edf434cb3-secret-volume\") pod \"4eadd010-ae18-4453-8810-1f8edf434cb3\" (UID: \"4eadd010-ae18-4453-8810-1f8edf434cb3\") " Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.175960 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eadd010-ae18-4453-8810-1f8edf434cb3-config-volume" (OuterVolumeSpecName: "config-volume") pod "4eadd010-ae18-4453-8810-1f8edf434cb3" (UID: "4eadd010-ae18-4453-8810-1f8edf434cb3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.182680 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eadd010-ae18-4453-8810-1f8edf434cb3-kube-api-access-qkvkj" (OuterVolumeSpecName: "kube-api-access-qkvkj") pod "4eadd010-ae18-4453-8810-1f8edf434cb3" (UID: "4eadd010-ae18-4453-8810-1f8edf434cb3"). InnerVolumeSpecName "kube-api-access-qkvkj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.194636 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.204525 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eadd010-ae18-4453-8810-1f8edf434cb3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4eadd010-ae18-4453-8810-1f8edf434cb3" (UID: "4eadd010-ae18-4453-8810-1f8edf434cb3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.242542 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:40 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:40 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:40 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.242607 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.267966 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d166e84e-305d-4151-b60b-dcbb77718e09-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d166e84e-305d-4151-b60b-dcbb77718e09\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 
21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.268098 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d166e84e-305d-4151-b60b-dcbb77718e09-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d166e84e-305d-4151-b60b-dcbb77718e09\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.268218 4777 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eadd010-ae18-4453-8810-1f8edf434cb3-config-volume\") on node \"crc\" DevicePath \"\"" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.268234 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkvkj\" (UniqueName: \"kubernetes.io/projected/4eadd010-ae18-4453-8810-1f8edf434cb3-kube-api-access-qkvkj\") on node \"crc\" DevicePath \"\"" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.268247 4777 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eadd010-ae18-4453-8810-1f8edf434cb3-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.341394 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kl64h"] Feb 16 21:40:40 crc kubenswrapper[4777]: W0216 21:40:40.347153 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b1eaca3_9a44_4887_98cf_7cb2a4d406b7.slice/crio-0fe8b3477264fbb91cc8d1d6dce0112d81cffe2b2e1be543d1489fe18ee68e20 WatchSource:0}: Error finding container 0fe8b3477264fbb91cc8d1d6dce0112d81cffe2b2e1be543d1489fe18ee68e20: Status 404 returned error can't find the container with id 0fe8b3477264fbb91cc8d1d6dce0112d81cffe2b2e1be543d1489fe18ee68e20 Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.369762 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d166e84e-305d-4151-b60b-dcbb77718e09-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d166e84e-305d-4151-b60b-dcbb77718e09\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.369842 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d166e84e-305d-4151-b60b-dcbb77718e09-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d166e84e-305d-4151-b60b-dcbb77718e09\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.369914 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d166e84e-305d-4151-b60b-dcbb77718e09-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d166e84e-305d-4151-b60b-dcbb77718e09\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.390807 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d166e84e-305d-4151-b60b-dcbb77718e09-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d166e84e-305d-4151-b60b-dcbb77718e09\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.393687 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gdkjm"] Feb 16 21:40:40 crc kubenswrapper[4777]: W0216 21:40:40.395869 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaaf3fb4_0bfd_4a29_aebe_d364790f620b.slice/crio-7d98676c47149ba10df8d75b1df5ab30249cd49e4125bb00928fbd4c32195eba 
WatchSource:0}: Error finding container 7d98676c47149ba10df8d75b1df5ab30249cd49e4125bb00928fbd4c32195eba: Status 404 returned error can't find the container with id 7d98676c47149ba10df8d75b1df5ab30249cd49e4125bb00928fbd4c32195eba Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.441441 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9wtvs"] Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.583634 4777 generic.go:334] "Generic (PLEG): container finished" podID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" containerID="d96bf99acf21c9d55f56f625c5a9b34a6b9e35ffd110289f202824088fbf59c2" exitCode=0 Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.583697 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxjq" event={"ID":"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf","Type":"ContainerDied","Data":"d96bf99acf21c9d55f56f625c5a9b34a6b9e35ffd110289f202824088fbf59c2"} Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.584052 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxjq" event={"ID":"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf","Type":"ContainerStarted","Data":"8ded80d680336ca291a68e69af2b1599d779faeb1610c0fa656b51e88f640f91"} Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.585449 4777 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.586240 4777 generic.go:334] "Generic (PLEG): container finished" podID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" containerID="5128df3cf414790370c341ba0a8bea9077d36d80b8700da9b2454b097d725892" exitCode=0 Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.586309 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kl64h" 
event={"ID":"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7","Type":"ContainerDied","Data":"5128df3cf414790370c341ba0a8bea9077d36d80b8700da9b2454b097d725892"} Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.586351 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kl64h" event={"ID":"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7","Type":"ContainerStarted","Data":"0fe8b3477264fbb91cc8d1d6dce0112d81cffe2b2e1be543d1489fe18ee68e20"} Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.590081 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" event={"ID":"4eadd010-ae18-4453-8810-1f8edf434cb3","Type":"ContainerDied","Data":"e426f29730b689a9b57a0428c64699d4b5deedc47b96ebc7c2d58bc870308487"} Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.590113 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e426f29730b689a9b57a0428c64699d4b5deedc47b96ebc7c2d58bc870308487" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.590138 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.591829 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wtvs" event={"ID":"950a7269-d4a8-4d38-b02f-c41037ef0341","Type":"ContainerStarted","Data":"65262afdd2f30c358f2d7ba58414d3a746bbdf35645ef2f63e9e77158eccdd9b"} Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.592943 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" event={"ID":"aaaf3fb4-0bfd-4a29-aebe-d364790f620b","Type":"ContainerStarted","Data":"d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795"} Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.592972 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" event={"ID":"aaaf3fb4-0bfd-4a29-aebe-d364790f620b","Type":"ContainerStarted","Data":"7d98676c47149ba10df8d75b1df5ab30249cd49e4125bb00928fbd4c32195eba"} Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.593532 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.597417 4777 generic.go:334] "Generic (PLEG): container finished" podID="93764402-52dc-48ce-9352-bd7982f7f0d6" containerID="b56598416149d2334df348d32552f0140ca772c58d9ab50ee4d8ff5311eb3f84" exitCode=0 Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.598590 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrsdv" event={"ID":"93764402-52dc-48ce-9352-bd7982f7f0d6","Type":"ContainerDied","Data":"b56598416149d2334df348d32552f0140ca772c58d9ab50ee4d8ff5311eb3f84"} Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.598618 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-nrsdv" event={"ID":"93764402-52dc-48ce-9352-bd7982f7f0d6","Type":"ContainerStarted","Data":"9f4b9ce61816a48a4a2c6e7374a733caf05a9861f6e5da740c10f5820a2840ac"} Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.604704 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.626418 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" podStartSLOduration=139.626396498 podStartE2EDuration="2m19.626396498s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:40.622842368 +0000 UTC m=+161.205343460" watchObservedRunningTime="2026-02-16 21:40:40.626396498 +0000 UTC m=+161.208897600" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.704680 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.705779 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.710600 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.714201 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.716204 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.777971 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/541fef7b-2b93-40ee-a252-400185f4a568-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"541fef7b-2b93-40ee-a252-400185f4a568\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.778067 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/541fef7b-2b93-40ee-a252-400185f4a568-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"541fef7b-2b93-40ee-a252-400185f4a568\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.877509 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.879428 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/541fef7b-2b93-40ee-a252-400185f4a568-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"541fef7b-2b93-40ee-a252-400185f4a568\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.879529 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/541fef7b-2b93-40ee-a252-400185f4a568-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"541fef7b-2b93-40ee-a252-400185f4a568\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.879688 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/541fef7b-2b93-40ee-a252-400185f4a568-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"541fef7b-2b93-40ee-a252-400185f4a568\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 21:40:40 crc kubenswrapper[4777]: I0216 21:40:40.903228 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/541fef7b-2b93-40ee-a252-400185f4a568-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"541fef7b-2b93-40ee-a252-400185f4a568\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.025172 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.026472 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k68t2"] Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.028018 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.030734 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.036494 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k68t2"] Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.081946 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-utilities\") pod \"redhat-marketplace-k68t2\" (UID: \"01c46266-f08a-405d-8768-8075a15bb61d\") " pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.082254 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxjwv\" (UniqueName: \"kubernetes.io/projected/01c46266-f08a-405d-8768-8075a15bb61d-kube-api-access-dxjwv\") pod \"redhat-marketplace-k68t2\" (UID: \"01c46266-f08a-405d-8768-8075a15bb61d\") " pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.082342 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-catalog-content\") pod \"redhat-marketplace-k68t2\" (UID: \"01c46266-f08a-405d-8768-8075a15bb61d\") " pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.182986 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-catalog-content\") pod \"redhat-marketplace-k68t2\" (UID: 
\"01c46266-f08a-405d-8768-8075a15bb61d\") " pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.183056 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-utilities\") pod \"redhat-marketplace-k68t2\" (UID: \"01c46266-f08a-405d-8768-8075a15bb61d\") " pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.183087 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxjwv\" (UniqueName: \"kubernetes.io/projected/01c46266-f08a-405d-8768-8075a15bb61d-kube-api-access-dxjwv\") pod \"redhat-marketplace-k68t2\" (UID: \"01c46266-f08a-405d-8768-8075a15bb61d\") " pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.183573 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-catalog-content\") pod \"redhat-marketplace-k68t2\" (UID: \"01c46266-f08a-405d-8768-8075a15bb61d\") " pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.183679 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-utilities\") pod \"redhat-marketplace-k68t2\" (UID: \"01c46266-f08a-405d-8768-8075a15bb61d\") " pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.204909 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxjwv\" (UniqueName: \"kubernetes.io/projected/01c46266-f08a-405d-8768-8075a15bb61d-kube-api-access-dxjwv\") pod \"redhat-marketplace-k68t2\" (UID: \"01c46266-f08a-405d-8768-8075a15bb61d\") " 
pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.227324 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.241861 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:41 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:41 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:41 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.241935 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.347314 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.432313 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mv49x"] Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.437516 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mv49x"] Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.437640 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.495497 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-catalog-content\") pod \"redhat-marketplace-mv49x\" (UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.495779 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhx9b\" (UniqueName: \"kubernetes.io/projected/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-kube-api-access-jhx9b\") pod \"redhat-marketplace-mv49x\" (UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.496020 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-utilities\") pod \"redhat-marketplace-mv49x\" (UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.597780 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-utilities\") pod \"redhat-marketplace-mv49x\" (UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.597873 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-catalog-content\") pod \"redhat-marketplace-mv49x\" 
(UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.597909 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhx9b\" (UniqueName: \"kubernetes.io/projected/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-kube-api-access-jhx9b\") pod \"redhat-marketplace-mv49x\" (UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.599450 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-catalog-content\") pod \"redhat-marketplace-mv49x\" (UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.599964 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-utilities\") pod \"redhat-marketplace-mv49x\" (UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.606296 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k68t2"] Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.625874 4777 generic.go:334] "Generic (PLEG): container finished" podID="950a7269-d4a8-4d38-b02f-c41037ef0341" containerID="5749f21752d6705fe64765fa47ace21e3ebd93b9c77ffc1f4aa55e4053dede73" exitCode=0 Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.626039 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wtvs" 
event={"ID":"950a7269-d4a8-4d38-b02f-c41037ef0341","Type":"ContainerDied","Data":"5749f21752d6705fe64765fa47ace21e3ebd93b9c77ffc1f4aa55e4053dede73"} Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.628840 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhx9b\" (UniqueName: \"kubernetes.io/projected/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-kube-api-access-jhx9b\") pod \"redhat-marketplace-mv49x\" (UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.630487 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"541fef7b-2b93-40ee-a252-400185f4a568","Type":"ContainerStarted","Data":"269c513177bc5452c10a3a8bbb2371a51f3f3d3c14e7d1f3f5c609199cd8f4bf"} Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.633353 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d166e84e-305d-4151-b60b-dcbb77718e09","Type":"ContainerStarted","Data":"64064db56401964a04b472ae77b21b393d875c166420cfc09bf4c82c91950f5a"} Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.633397 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d166e84e-305d-4151-b60b-dcbb77718e09","Type":"ContainerStarted","Data":"c574f28755bac1dd44e62bb4d61f1471a292c90411550379a00eac15676e3cf3"} Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.651375 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.651429 4777 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.662232 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.6622158580000002 podStartE2EDuration="1.662215858s" podCreationTimestamp="2026-02-16 21:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:41.655379206 +0000 UTC m=+162.237880308" watchObservedRunningTime="2026-02-16 21:40:41.662215858 +0000 UTC m=+162.244716960" Feb 16 21:40:41 crc kubenswrapper[4777]: W0216 21:40:41.664445 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01c46266_f08a_405d_8768_8075a15bb61d.slice/crio-c591359e258c564ecbce36f8db609ee9fde356b998ccd3d5767b222c8d6633fc WatchSource:0}: Error finding container c591359e258c564ecbce36f8db609ee9fde356b998ccd3d5767b222c8d6633fc: Status 404 returned error can't find the container with id c591359e258c564ecbce36f8db609ee9fde356b998ccd3d5767b222c8d6633fc Feb 16 21:40:41 crc kubenswrapper[4777]: I0216 21:40:41.811392 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.030393 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b76h6"] Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.042786 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b76h6"] Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.042890 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.048684 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.111502 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2bq5\" (UniqueName: \"kubernetes.io/projected/f0761e7f-6d18-4832-be21-89d3415e7b21-kube-api-access-q2bq5\") pod \"redhat-operators-b76h6\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.111639 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-utilities\") pod \"redhat-operators-b76h6\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.111666 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-catalog-content\") pod \"redhat-operators-b76h6\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " 
pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.139294 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mv49x"] Feb 16 21:40:42 crc kubenswrapper[4777]: W0216 21:40:42.162309 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e9ca7b4_abe8_480b_8d82_a44eea03d22e.slice/crio-eb34482ee5f3b8f5b281d9f1f4a332036df0802ad43df813e45bdb6321e713c1 WatchSource:0}: Error finding container eb34482ee5f3b8f5b281d9f1f4a332036df0802ad43df813e45bdb6321e713c1: Status 404 returned error can't find the container with id eb34482ee5f3b8f5b281d9f1f4a332036df0802ad43df813e45bdb6321e713c1 Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.162444 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.162804 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.173219 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.214009 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-utilities\") pod \"redhat-operators-b76h6\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.214112 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-catalog-content\") pod 
\"redhat-operators-b76h6\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.214180 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bq5\" (UniqueName: \"kubernetes.io/projected/f0761e7f-6d18-4832-be21-89d3415e7b21-kube-api-access-q2bq5\") pod \"redhat-operators-b76h6\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.215349 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-utilities\") pod \"redhat-operators-b76h6\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.216558 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-catalog-content\") pod \"redhat-operators-b76h6\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.232431 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xbfdc"] Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.233515 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.238075 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.243059 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:42 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:42 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:42 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.243109 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.247693 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2bq5\" (UniqueName: \"kubernetes.io/projected/f0761e7f-6d18-4832-be21-89d3415e7b21-kube-api-access-q2bq5\") pod \"redhat-operators-b76h6\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.248559 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbfdc"] Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.299070 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.299137 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.302739 4777 patch_prober.go:28] interesting pod/console-f9d7485db-rxnqn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.302875 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rxnqn" podUID="7575698b-ab60-49ed-9d95-744b540314a5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.315950 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr27j\" (UniqueName: \"kubernetes.io/projected/c6f1ee11-a9b8-4398-a964-04d976458836-kube-api-access-xr27j\") pod \"redhat-operators-xbfdc\" (UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.316000 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-catalog-content\") pod \"redhat-operators-xbfdc\" (UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.316031 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-utilities\") pod \"redhat-operators-xbfdc\" (UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:40:42 crc 
kubenswrapper[4777]: I0216 21:40:42.373150 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.417439 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr27j\" (UniqueName: \"kubernetes.io/projected/c6f1ee11-a9b8-4398-a964-04d976458836-kube-api-access-xr27j\") pod \"redhat-operators-xbfdc\" (UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.417498 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-catalog-content\") pod \"redhat-operators-xbfdc\" (UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.417531 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-utilities\") pod \"redhat-operators-xbfdc\" (UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.418165 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-utilities\") pod \"redhat-operators-xbfdc\" (UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.418768 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-catalog-content\") pod \"redhat-operators-xbfdc\" 
(UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.448234 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr27j\" (UniqueName: \"kubernetes.io/projected/c6f1ee11-a9b8-4398-a964-04d976458836-kube-api-access-xr27j\") pod \"redhat-operators-xbfdc\" (UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.486147 4777 patch_prober.go:28] interesting pod/downloads-7954f5f757-phc5s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.486209 4777 patch_prober.go:28] interesting pod/downloads-7954f5f757-phc5s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.486224 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-phc5s" podUID="29e22b2d-4f29-4258-8b81-31adeae29a3c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.486311 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-phc5s" podUID="29e22b2d-4f29-4258-8b81-31adeae29a3c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.516755 4777 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.522286 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-kgdkp" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.652032 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.670951 4777 generic.go:334] "Generic (PLEG): container finished" podID="01c46266-f08a-405d-8768-8075a15bb61d" containerID="57fa2c12ec0b92da7d1de7cfe3dd24ffb17e03ce32e657f18e50f86a5ae937ca" exitCode=0 Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.671072 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k68t2" event={"ID":"01c46266-f08a-405d-8768-8075a15bb61d","Type":"ContainerDied","Data":"57fa2c12ec0b92da7d1de7cfe3dd24ffb17e03ce32e657f18e50f86a5ae937ca"} Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.671107 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k68t2" event={"ID":"01c46266-f08a-405d-8768-8075a15bb61d","Type":"ContainerStarted","Data":"c591359e258c564ecbce36f8db609ee9fde356b998ccd3d5767b222c8d6633fc"} Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.680001 4777 generic.go:334] "Generic (PLEG): container finished" podID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" containerID="b9dd490ad3e75b5fbc8797be48a48c448d01272bf060674cf8e8cbf00ddfe1e4" exitCode=0 Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.680973 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv49x" event={"ID":"1e9ca7b4-abe8-480b-8d82-a44eea03d22e","Type":"ContainerDied","Data":"b9dd490ad3e75b5fbc8797be48a48c448d01272bf060674cf8e8cbf00ddfe1e4"} Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 
21:40:42.681098 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv49x" event={"ID":"1e9ca7b4-abe8-480b-8d82-a44eea03d22e","Type":"ContainerStarted","Data":"eb34482ee5f3b8f5b281d9f1f4a332036df0802ad43df813e45bdb6321e713c1"} Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.694555 4777 generic.go:334] "Generic (PLEG): container finished" podID="541fef7b-2b93-40ee-a252-400185f4a568" containerID="83856c4b0366e411564ffbc38eccd64fcc201b849e36a125daa242e8f65a1067" exitCode=0 Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.694656 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"541fef7b-2b93-40ee-a252-400185f4a568","Type":"ContainerDied","Data":"83856c4b0366e411564ffbc38eccd64fcc201b849e36a125daa242e8f65a1067"} Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.735929 4777 generic.go:334] "Generic (PLEG): container finished" podID="d166e84e-305d-4151-b60b-dcbb77718e09" containerID="64064db56401964a04b472ae77b21b393d875c166420cfc09bf4c82c91950f5a" exitCode=0 Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.744968 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d166e84e-305d-4151-b60b-dcbb77718e09","Type":"ContainerDied","Data":"64064db56401964a04b472ae77b21b393d875c166420cfc09bf4c82c91950f5a"} Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.757634 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9wjm2" Feb 16 21:40:42 crc kubenswrapper[4777]: I0216 21:40:42.830918 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b76h6"] Feb 16 21:40:43 crc kubenswrapper[4777]: I0216 21:40:43.014070 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:40:43 crc kubenswrapper[4777]: I0216 21:40:43.165469 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbfdc"] Feb 16 21:40:43 crc kubenswrapper[4777]: W0216 21:40:43.224436 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6f1ee11_a9b8_4398_a964_04d976458836.slice/crio-28225ecf9d9fe59ac1baa23320e720604f4582fb6e7d2112c5736dfaa97826c8 WatchSource:0}: Error finding container 28225ecf9d9fe59ac1baa23320e720604f4582fb6e7d2112c5736dfaa97826c8: Status 404 returned error can't find the container with id 28225ecf9d9fe59ac1baa23320e720604f4582fb6e7d2112c5736dfaa97826c8 Feb 16 21:40:43 crc kubenswrapper[4777]: I0216 21:40:43.241629 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:43 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:43 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:43 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:43 crc kubenswrapper[4777]: I0216 21:40:43.241696 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:43 crc kubenswrapper[4777]: I0216 21:40:43.764084 4777 generic.go:334] "Generic (PLEG): container finished" podID="c6f1ee11-a9b8-4398-a964-04d976458836" containerID="88d34a3bdc2ee593da45feeeb65a5c7e2c278b998976d8b97dbae3c34bb9b3a8" exitCode=0 Feb 16 21:40:43 crc kubenswrapper[4777]: I0216 21:40:43.764196 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xbfdc" event={"ID":"c6f1ee11-a9b8-4398-a964-04d976458836","Type":"ContainerDied","Data":"88d34a3bdc2ee593da45feeeb65a5c7e2c278b998976d8b97dbae3c34bb9b3a8"} Feb 16 21:40:43 crc kubenswrapper[4777]: I0216 21:40:43.764572 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbfdc" event={"ID":"c6f1ee11-a9b8-4398-a964-04d976458836","Type":"ContainerStarted","Data":"28225ecf9d9fe59ac1baa23320e720604f4582fb6e7d2112c5736dfaa97826c8"} Feb 16 21:40:43 crc kubenswrapper[4777]: I0216 21:40:43.780264 4777 generic.go:334] "Generic (PLEG): container finished" podID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerID="bdbb6a62c747450ddad161ecb3243d32ca48d2bd1e017dc59cbbcb6df8da4280" exitCode=0 Feb 16 21:40:43 crc kubenswrapper[4777]: I0216 21:40:43.780462 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b76h6" event={"ID":"f0761e7f-6d18-4832-be21-89d3415e7b21","Type":"ContainerDied","Data":"bdbb6a62c747450ddad161ecb3243d32ca48d2bd1e017dc59cbbcb6df8da4280"} Feb 16 21:40:43 crc kubenswrapper[4777]: I0216 21:40:43.780511 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b76h6" event={"ID":"f0761e7f-6d18-4832-be21-89d3415e7b21","Type":"ContainerStarted","Data":"7ffd6aecd1e19ec403c8d9a24358d3706a9d0705116aa74bb197776a88d3ff71"} Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.023670 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.085507 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.170015 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/541fef7b-2b93-40ee-a252-400185f4a568-kube-api-access\") pod \"541fef7b-2b93-40ee-a252-400185f4a568\" (UID: \"541fef7b-2b93-40ee-a252-400185f4a568\") " Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.170358 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/541fef7b-2b93-40ee-a252-400185f4a568-kubelet-dir\") pod \"541fef7b-2b93-40ee-a252-400185f4a568\" (UID: \"541fef7b-2b93-40ee-a252-400185f4a568\") " Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.170379 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d166e84e-305d-4151-b60b-dcbb77718e09-kubelet-dir\") pod \"d166e84e-305d-4151-b60b-dcbb77718e09\" (UID: \"d166e84e-305d-4151-b60b-dcbb77718e09\") " Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.170502 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/541fef7b-2b93-40ee-a252-400185f4a568-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "541fef7b-2b93-40ee-a252-400185f4a568" (UID: "541fef7b-2b93-40ee-a252-400185f4a568"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.170610 4777 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/541fef7b-2b93-40ee-a252-400185f4a568-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.170621 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d166e84e-305d-4151-b60b-dcbb77718e09-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d166e84e-305d-4151-b60b-dcbb77718e09" (UID: "d166e84e-305d-4151-b60b-dcbb77718e09"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.185287 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541fef7b-2b93-40ee-a252-400185f4a568-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "541fef7b-2b93-40ee-a252-400185f4a568" (UID: "541fef7b-2b93-40ee-a252-400185f4a568"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.242194 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:44 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:44 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:44 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.242263 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.271500 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d166e84e-305d-4151-b60b-dcbb77718e09-kube-api-access\") pod \"d166e84e-305d-4151-b60b-dcbb77718e09\" (UID: \"d166e84e-305d-4151-b60b-dcbb77718e09\") " Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.272048 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/541fef7b-2b93-40ee-a252-400185f4a568-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.272066 4777 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d166e84e-305d-4151-b60b-dcbb77718e09-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.288588 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d166e84e-305d-4151-b60b-dcbb77718e09-kube-api-access" 
(OuterVolumeSpecName: "kube-api-access") pod "d166e84e-305d-4151-b60b-dcbb77718e09" (UID: "d166e84e-305d-4151-b60b-dcbb77718e09"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.372816 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d166e84e-305d-4151-b60b-dcbb77718e09-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.439498 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lzg58" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.575551 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.580511 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1d5abcd-6e58-4563-98c6-0adb808ed0a7-metrics-certs\") pod \"network-metrics-daemon-rwm84\" (UID: \"a1d5abcd-6e58-4563-98c6-0adb808ed0a7\") " pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.748626 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwm84" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.844947 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"541fef7b-2b93-40ee-a252-400185f4a568","Type":"ContainerDied","Data":"269c513177bc5452c10a3a8bbb2371a51f3f3d3c14e7d1f3f5c609199cd8f4bf"} Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.844995 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="269c513177bc5452c10a3a8bbb2371a51f3f3d3c14e7d1f3f5c609199cd8f4bf" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.844967 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.860370 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.860363 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d166e84e-305d-4151-b60b-dcbb77718e09","Type":"ContainerDied","Data":"c574f28755bac1dd44e62bb4d61f1471a292c90411550379a00eac15676e3cf3"} Feb 16 21:40:44 crc kubenswrapper[4777]: I0216 21:40:44.860448 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c574f28755bac1dd44e62bb4d61f1471a292c90411550379a00eac15676e3cf3" Feb 16 21:40:45 crc kubenswrapper[4777]: I0216 21:40:45.182076 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rwm84"] Feb 16 21:40:45 crc kubenswrapper[4777]: W0216 21:40:45.193788 4777 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1d5abcd_6e58_4563_98c6_0adb808ed0a7.slice/crio-9cac6b110d1c6fd3192572ff9983a1801807a42aaadfe3aee441ae49e9501c84 WatchSource:0}: Error finding container 9cac6b110d1c6fd3192572ff9983a1801807a42aaadfe3aee441ae49e9501c84: Status 404 returned error can't find the container with id 9cac6b110d1c6fd3192572ff9983a1801807a42aaadfe3aee441ae49e9501c84 Feb 16 21:40:45 crc kubenswrapper[4777]: I0216 21:40:45.242330 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:45 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:45 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:45 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:45 crc kubenswrapper[4777]: I0216 21:40:45.242397 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:45 crc kubenswrapper[4777]: I0216 21:40:45.876340 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rwm84" event={"ID":"a1d5abcd-6e58-4563-98c6-0adb808ed0a7","Type":"ContainerStarted","Data":"9cac6b110d1c6fd3192572ff9983a1801807a42aaadfe3aee441ae49e9501c84"} Feb 16 21:40:46 crc kubenswrapper[4777]: I0216 21:40:46.241979 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:46 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:46 crc kubenswrapper[4777]: 
[+]process-running ok Feb 16 21:40:46 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:46 crc kubenswrapper[4777]: I0216 21:40:46.242353 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:46 crc kubenswrapper[4777]: I0216 21:40:46.896461 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rwm84" event={"ID":"a1d5abcd-6e58-4563-98c6-0adb808ed0a7","Type":"ContainerStarted","Data":"311427ccac1404cff987e4b9189505b7173887b4f234b6e9f69d650f2d0049e7"} Feb 16 21:40:47 crc kubenswrapper[4777]: I0216 21:40:47.242564 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:47 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:47 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:47 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:47 crc kubenswrapper[4777]: I0216 21:40:47.242632 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:47 crc kubenswrapper[4777]: I0216 21:40:47.913194 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rwm84" event={"ID":"a1d5abcd-6e58-4563-98c6-0adb808ed0a7","Type":"ContainerStarted","Data":"8dfb5c3b15c16d457566087b9c4b4a92671c976d34d12be1a1cdb76b45339869"} Feb 16 21:40:47 crc kubenswrapper[4777]: I0216 21:40:47.935813 4777 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-rwm84" podStartSLOduration=146.935792739 podStartE2EDuration="2m26.935792739s" podCreationTimestamp="2026-02-16 21:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:40:47.93315451 +0000 UTC m=+168.515655622" watchObservedRunningTime="2026-02-16 21:40:47.935792739 +0000 UTC m=+168.518293841" Feb 16 21:40:48 crc kubenswrapper[4777]: I0216 21:40:48.240994 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:48 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:48 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:48 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:48 crc kubenswrapper[4777]: I0216 21:40:48.241066 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:49 crc kubenswrapper[4777]: I0216 21:40:49.241510 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:49 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:49 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:49 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:49 crc kubenswrapper[4777]: I0216 21:40:49.241596 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" 
podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:50 crc kubenswrapper[4777]: I0216 21:40:50.240926 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:50 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:50 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:50 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:50 crc kubenswrapper[4777]: I0216 21:40:50.240992 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:51 crc kubenswrapper[4777]: I0216 21:40:51.240751 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:51 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:51 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:51 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:51 crc kubenswrapper[4777]: I0216 21:40:51.240830 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:52 crc kubenswrapper[4777]: I0216 21:40:52.243636 4777 patch_prober.go:28] interesting pod/router-default-5444994796-l29zb container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 16 21:40:52 crc kubenswrapper[4777]: [-]has-synced failed: reason withheld Feb 16 21:40:52 crc kubenswrapper[4777]: [+]process-running ok Feb 16 21:40:52 crc kubenswrapper[4777]: healthz check failed Feb 16 21:40:52 crc kubenswrapper[4777]: I0216 21:40:52.244058 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l29zb" podUID="1457b00a-86eb-457e-99a3-bf4bb271c513" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 16 21:40:52 crc kubenswrapper[4777]: I0216 21:40:52.299657 4777 patch_prober.go:28] interesting pod/console-f9d7485db-rxnqn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 16 21:40:52 crc kubenswrapper[4777]: I0216 21:40:52.299740 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rxnqn" podUID="7575698b-ab60-49ed-9d95-744b540314a5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 16 21:40:52 crc kubenswrapper[4777]: I0216 21:40:52.493195 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-phc5s" Feb 16 21:40:53 crc kubenswrapper[4777]: I0216 21:40:53.240908 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:53 crc kubenswrapper[4777]: I0216 21:40:53.243062 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-l29zb" Feb 16 21:40:58 crc kubenswrapper[4777]: I0216 21:40:58.556470 4777 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 16 21:40:59 crc kubenswrapper[4777]: I0216 21:40:59.875337 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" Feb 16 21:41:02 crc kubenswrapper[4777]: I0216 21:41:02.306468 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:41:02 crc kubenswrapper[4777]: I0216 21:41:02.313433 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rxnqn" Feb 16 21:41:07 crc kubenswrapper[4777]: E0216 21:41:07.776739 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 16 21:41:07 crc kubenswrapper[4777]: E0216 21:41:07.777440 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jhx9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mv49x_openshift-marketplace(1e9ca7b4-abe8-480b-8d82-a44eea03d22e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 21:41:07 crc kubenswrapper[4777]: E0216 21:41:07.779186 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mv49x" podUID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" Feb 16 21:41:11 crc 
kubenswrapper[4777]: E0216 21:41:11.588767 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mv49x" podUID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" Feb 16 21:41:11 crc kubenswrapper[4777]: I0216 21:41:11.651187 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:41:11 crc kubenswrapper[4777]: I0216 21:41:11.651448 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:41:11 crc kubenswrapper[4777]: E0216 21:41:11.665696 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 16 21:41:11 crc kubenswrapper[4777]: E0216 21:41:11.665984 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dxjwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-k68t2_openshift-marketplace(01c46266-f08a-405d-8768-8075a15bb61d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 21:41:11 crc kubenswrapper[4777]: E0216 21:41:11.667226 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-k68t2" podUID="01c46266-f08a-405d-8768-8075a15bb61d" Feb 16 21:41:11 crc 
kubenswrapper[4777]: E0216 21:41:11.672526 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 16 21:41:11 crc kubenswrapper[4777]: E0216 21:41:11.672947 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xr27j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-xbfdc_openshift-marketplace(c6f1ee11-a9b8-4398-a964-04d976458836): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 21:41:11 crc kubenswrapper[4777]: E0216 21:41:11.674440 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xbfdc" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" Feb 16 21:41:11 crc kubenswrapper[4777]: E0216 21:41:11.701943 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 16 21:41:11 crc kubenswrapper[4777]: E0216 21:41:11.702165 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25tx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wzxjq_openshift-marketplace(a6da8337-cf15-4ee9-a4f9-c7047aad3cdf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 21:41:11 crc kubenswrapper[4777]: E0216 21:41:11.703631 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wzxjq" podUID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" Feb 16 21:41:13 crc 
kubenswrapper[4777]: I0216 21:41:13.387799 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9d2t" Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.452036 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xbfdc" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.452064 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-k68t2" podUID="01c46266-f08a-405d-8768-8075a15bb61d" Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.452064 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wzxjq" podUID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.531142 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.532071 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5lw8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-nrsdv_openshift-marketplace(93764402-52dc-48ce-9352-bd7982f7f0d6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.533343 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-nrsdv" 
podUID="93764402-52dc-48ce-9352-bd7982f7f0d6"
Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.543835 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.544020 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q2bq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-b76h6_openshift-marketplace(f0761e7f-6d18-4832-be21-89d3415e7b21): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.545637 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-b76h6" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21"
Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.572610 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.572774 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bgbvk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9wtvs_openshift-marketplace(950a7269-d4a8-4d38-b02f-c41037ef0341): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 16 21:41:13 crc kubenswrapper[4777]: E0216 21:41:13.574049 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9wtvs" podUID="950a7269-d4a8-4d38-b02f-c41037ef0341"
Feb 16 21:41:14 crc kubenswrapper[4777]: I0216 21:41:14.094870 4777 generic.go:334] "Generic (PLEG): container finished" podID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" containerID="0532d2746296d91bc5e3b5410ae54218733c19d3ae6cf03e0013683b838e1667" exitCode=0
Feb 16 21:41:14 crc kubenswrapper[4777]: I0216 21:41:14.095022 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kl64h" event={"ID":"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7","Type":"ContainerDied","Data":"0532d2746296d91bc5e3b5410ae54218733c19d3ae6cf03e0013683b838e1667"}
Feb 16 21:41:14 crc kubenswrapper[4777]: E0216 21:41:14.097410 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wtvs" podUID="950a7269-d4a8-4d38-b02f-c41037ef0341"
Feb 16 21:41:14 crc kubenswrapper[4777]: E0216 21:41:14.097725 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-b76h6" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21"
Feb 16 21:41:14 crc kubenswrapper[4777]: E0216 21:41:14.097918 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-nrsdv" podUID="93764402-52dc-48ce-9352-bd7982f7f0d6"
Feb 16 21:41:15 crc kubenswrapper[4777]: I0216 21:41:15.103000 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kl64h" event={"ID":"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7","Type":"ContainerStarted","Data":"c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00"}
Feb 16 21:41:15 crc kubenswrapper[4777]: I0216 21:41:15.146147 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kl64h" podStartSLOduration=2.215473987 podStartE2EDuration="36.146116745s" podCreationTimestamp="2026-02-16 21:40:39 +0000 UTC" firstStartedPulling="2026-02-16 21:40:40.588137853 +0000 UTC m=+161.170638955" lastFinishedPulling="2026-02-16 21:41:14.518780601 +0000 UTC m=+195.101281713" observedRunningTime="2026-02-16 21:41:15.139129108 +0000 UTC m=+195.721630260" watchObservedRunningTime="2026-02-16 21:41:15.146116745 +0000 UTC m=+195.728617857"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.696050 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 16 21:41:18 crc kubenswrapper[4777]: E0216 21:41:18.696680 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d166e84e-305d-4151-b60b-dcbb77718e09" containerName="pruner"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.696751 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="d166e84e-305d-4151-b60b-dcbb77718e09" containerName="pruner"
Feb 16 21:41:18 crc kubenswrapper[4777]: E0216 21:41:18.696774 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541fef7b-2b93-40ee-a252-400185f4a568" containerName="pruner"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.696781 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="541fef7b-2b93-40ee-a252-400185f4a568" containerName="pruner"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.696892 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="541fef7b-2b93-40ee-a252-400185f4a568" containerName="pruner"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.696901 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="d166e84e-305d-4151-b60b-dcbb77718e09" containerName="pruner"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.697318 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.704609 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.705016 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.711627 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.810015 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.810104 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.912042 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.912128 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.912204 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 16 21:41:18 crc kubenswrapper[4777]: I0216 21:41:18.931704 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 16 21:41:19 crc kubenswrapper[4777]: I0216 21:41:19.030332 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 16 21:41:19 crc kubenswrapper[4777]: I0216 21:41:19.561023 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:41:19 crc kubenswrapper[4777]: I0216 21:41:19.561377 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:41:19 crc kubenswrapper[4777]: I0216 21:41:19.571547 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 16 21:41:19 crc kubenswrapper[4777]: I0216 21:41:19.868346 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:41:20 crc kubenswrapper[4777]: I0216 21:41:20.135911 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0","Type":"ContainerStarted","Data":"8b66d447e2a83dcead943f1cf04d7f69b6b150923e9385c0ba3b1728af60b8c6"}
Feb 16 21:41:20 crc kubenswrapper[4777]: I0216 21:41:20.136243 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0","Type":"ContainerStarted","Data":"583e9a10b51b19a8c6c16de3ef6cddddd8ed5d385e14728a4ed6fc226087f63b"}
Feb 16 21:41:20 crc kubenswrapper[4777]: I0216 21:41:20.152555 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.152535071 podStartE2EDuration="2.152535071s" podCreationTimestamp="2026-02-16 21:41:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:41:20.148885765 +0000 UTC m=+200.731386877" watchObservedRunningTime="2026-02-16 21:41:20.152535071 +0000 UTC m=+200.735036173"
Feb 16 21:41:20 crc kubenswrapper[4777]: I0216 21:41:20.175767 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:41:20 crc kubenswrapper[4777]: I0216 21:41:20.629563 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kl64h"]
Feb 16 21:41:21 crc kubenswrapper[4777]: I0216 21:41:21.151968 4777 generic.go:334] "Generic (PLEG): container finished" podID="b8a62d27-1d28-4ba4-8f2e-b4d2607aade0" containerID="8b66d447e2a83dcead943f1cf04d7f69b6b150923e9385c0ba3b1728af60b8c6" exitCode=0
Feb 16 21:41:21 crc kubenswrapper[4777]: I0216 21:41:21.152201 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0","Type":"ContainerDied","Data":"8b66d447e2a83dcead943f1cf04d7f69b6b150923e9385c0ba3b1728af60b8c6"}
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.157237 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kl64h" podUID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" containerName="registry-server" containerID="cri-o://c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00" gracePeriod=2
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.494629 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.501146 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.585989 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kubelet-dir\") pod \"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0\" (UID: \"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0\") "
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.586088 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b8a62d27-1d28-4ba4-8f2e-b4d2607aade0" (UID: "b8a62d27-1d28-4ba4-8f2e-b4d2607aade0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.586167 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kube-api-access\") pod \"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0\" (UID: \"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0\") "
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.586834 4777 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.592269 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b8a62d27-1d28-4ba4-8f2e-b4d2607aade0" (UID: "b8a62d27-1d28-4ba4-8f2e-b4d2607aade0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.687391 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-utilities\") pod \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") "
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.687510 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfpk6\" (UniqueName: \"kubernetes.io/projected/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-kube-api-access-mfpk6\") pod \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") "
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.687614 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-catalog-content\") pod \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\" (UID: \"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7\") "
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.687936 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8a62d27-1d28-4ba4-8f2e-b4d2607aade0-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.689374 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-utilities" (OuterVolumeSpecName: "utilities") pod "4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" (UID: "4b1eaca3-9a44-4887-98cf-7cb2a4d406b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.690746 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-kube-api-access-mfpk6" (OuterVolumeSpecName: "kube-api-access-mfpk6") pod "4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" (UID: "4b1eaca3-9a44-4887-98cf-7cb2a4d406b7"). InnerVolumeSpecName "kube-api-access-mfpk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.742372 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" (UID: "4b1eaca3-9a44-4887-98cf-7cb2a4d406b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.789010 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.789036 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-utilities\") on node \"crc\" DevicePath \"\""
Feb 16 21:41:22 crc kubenswrapper[4777]: I0216 21:41:22.789048 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfpk6\" (UniqueName: \"kubernetes.io/projected/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7-kube-api-access-mfpk6\") on node \"crc\" DevicePath \"\""
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.164555 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.164521 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b8a62d27-1d28-4ba4-8f2e-b4d2607aade0","Type":"ContainerDied","Data":"583e9a10b51b19a8c6c16de3ef6cddddd8ed5d385e14728a4ed6fc226087f63b"}
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.164697 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="583e9a10b51b19a8c6c16de3ef6cddddd8ed5d385e14728a4ed6fc226087f63b"
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.167105 4777 generic.go:334] "Generic (PLEG): container finished" podID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" containerID="c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00" exitCode=0
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.167179 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kl64h"
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.167201 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kl64h" event={"ID":"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7","Type":"ContainerDied","Data":"c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00"}
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.167240 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kl64h" event={"ID":"4b1eaca3-9a44-4887-98cf-7cb2a4d406b7","Type":"ContainerDied","Data":"0fe8b3477264fbb91cc8d1d6dce0112d81cffe2b2e1be543d1489fe18ee68e20"}
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.167261 4777 scope.go:117] "RemoveContainer" containerID="c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00"
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.184656 4777 scope.go:117] "RemoveContainer" containerID="0532d2746296d91bc5e3b5410ae54218733c19d3ae6cf03e0013683b838e1667"
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.205081 4777 scope.go:117] "RemoveContainer" containerID="5128df3cf414790370c341ba0a8bea9077d36d80b8700da9b2454b097d725892"
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.209278 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kl64h"]
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.214068 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kl64h"]
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.224961 4777 scope.go:117] "RemoveContainer" containerID="c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00"
Feb 16 21:41:23 crc kubenswrapper[4777]: E0216 21:41:23.225514 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00\": container with ID starting with c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00 not found: ID does not exist" containerID="c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00"
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.225557 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00"} err="failed to get container status \"c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00\": rpc error: code = NotFound desc = could not find container \"c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00\": container with ID starting with c32867ccbf5a96e3eee99549307e6f4f7f6b1028664a8629b96e803f53fecb00 not found: ID does not exist"
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.225619 4777 scope.go:117] "RemoveContainer" containerID="0532d2746296d91bc5e3b5410ae54218733c19d3ae6cf03e0013683b838e1667"
Feb 16 21:41:23 crc kubenswrapper[4777]: E0216 21:41:23.225848 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0532d2746296d91bc5e3b5410ae54218733c19d3ae6cf03e0013683b838e1667\": container with ID starting with 0532d2746296d91bc5e3b5410ae54218733c19d3ae6cf03e0013683b838e1667 not found: ID does not exist" containerID="0532d2746296d91bc5e3b5410ae54218733c19d3ae6cf03e0013683b838e1667"
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.225874 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0532d2746296d91bc5e3b5410ae54218733c19d3ae6cf03e0013683b838e1667"} err="failed to get container status \"0532d2746296d91bc5e3b5410ae54218733c19d3ae6cf03e0013683b838e1667\": rpc error: code = NotFound desc = could not find container \"0532d2746296d91bc5e3b5410ae54218733c19d3ae6cf03e0013683b838e1667\": container with ID starting with 0532d2746296d91bc5e3b5410ae54218733c19d3ae6cf03e0013683b838e1667 not found: ID does not exist"
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.225889 4777 scope.go:117] "RemoveContainer" containerID="5128df3cf414790370c341ba0a8bea9077d36d80b8700da9b2454b097d725892"
Feb 16 21:41:23 crc kubenswrapper[4777]: E0216 21:41:23.226177 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5128df3cf414790370c341ba0a8bea9077d36d80b8700da9b2454b097d725892\": container with ID starting with 5128df3cf414790370c341ba0a8bea9077d36d80b8700da9b2454b097d725892 not found: ID does not exist" containerID="5128df3cf414790370c341ba0a8bea9077d36d80b8700da9b2454b097d725892"
Feb 16 21:41:23 crc kubenswrapper[4777]: I0216 21:41:23.226205 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5128df3cf414790370c341ba0a8bea9077d36d80b8700da9b2454b097d725892"} err="failed to get container status \"5128df3cf414790370c341ba0a8bea9077d36d80b8700da9b2454b097d725892\": rpc error: code = NotFound desc = could not find container \"5128df3cf414790370c341ba0a8bea9077d36d80b8700da9b2454b097d725892\": container with ID starting with 5128df3cf414790370c341ba0a8bea9077d36d80b8700da9b2454b097d725892 not found: ID does not exist"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.191194 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" path="/var/lib/kubelet/pods/4b1eaca3-9a44-4887-98cf-7cb2a4d406b7/volumes"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.492041 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 16 21:41:24 crc kubenswrapper[4777]: E0216 21:41:24.492326 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" containerName="extract-content"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.492343 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" containerName="extract-content"
Feb 16 21:41:24 crc kubenswrapper[4777]: E0216 21:41:24.492353 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" containerName="registry-server"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.492361 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" containerName="registry-server"
Feb 16 21:41:24 crc kubenswrapper[4777]: E0216 21:41:24.492373 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a62d27-1d28-4ba4-8f2e-b4d2607aade0" containerName="pruner"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.492381 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a62d27-1d28-4ba4-8f2e-b4d2607aade0" containerName="pruner"
Feb 16 21:41:24 crc kubenswrapper[4777]: E0216 21:41:24.492395 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" containerName="extract-utilities"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.492402 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" containerName="extract-utilities"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.492523 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a62d27-1d28-4ba4-8f2e-b4d2607aade0" containerName="pruner"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.492542 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b1eaca3-9a44-4887-98cf-7cb2a4d406b7" containerName="registry-server"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.492991 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.495638 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.496161 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.503686 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.614517 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-var-lock\") pod \"installer-9-crc\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.614732 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kube-api-access\") pod \"installer-9-crc\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.614825 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.715889 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kube-api-access\") pod \"installer-9-crc\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.716063 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.716210 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-var-lock\") pod \"installer-9-crc\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.716247 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.716367 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-var-lock\") pod \"installer-9-crc\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.737630 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kube-api-access\") pod \"installer-9-crc\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 16 21:41:24 crc kubenswrapper[4777]: I0216 21:41:24.836088 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 16 21:41:25 crc kubenswrapper[4777]: I0216 21:41:25.183814 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv49x" event={"ID":"1e9ca7b4-abe8-480b-8d82-a44eea03d22e","Type":"ContainerStarted","Data":"f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c"}
Feb 16 21:41:25 crc kubenswrapper[4777]: I0216 21:41:25.185686 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbfdc" event={"ID":"c6f1ee11-a9b8-4398-a964-04d976458836","Type":"ContainerStarted","Data":"104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953"}
Feb 16 21:41:25 crc kubenswrapper[4777]: I0216 21:41:25.279618 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 16 21:41:25 crc kubenswrapper[4777]: W0216 21:41:25.347911 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0608d7ad_d8d2_494f_a827_d7ebf1e78a78.slice/crio-630414cb935ab50887d274ed3444ed22703f2b5d8799a9850b70777cd4801d2c WatchSource:0}: Error finding container 630414cb935ab50887d274ed3444ed22703f2b5d8799a9850b70777cd4801d2c: Status 404 returned error can't find the container with id 630414cb935ab50887d274ed3444ed22703f2b5d8799a9850b70777cd4801d2c
Feb 16 21:41:26 crc kubenswrapper[4777]: I0216 21:41:26.204207 4777 generic.go:334] "Generic (PLEG): container finished" podID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" containerID="f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c" exitCode=0
Feb 16 21:41:26 crc kubenswrapper[4777]: I0216 21:41:26.204316 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv49x" event={"ID":"1e9ca7b4-abe8-480b-8d82-a44eea03d22e","Type":"ContainerDied","Data":"f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c"}
Feb 16 21:41:26 crc kubenswrapper[4777]: I0216 21:41:26.206796 4777 generic.go:334] "Generic (PLEG): container finished" podID="c6f1ee11-a9b8-4398-a964-04d976458836" containerID="104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953" exitCode=0
Feb 16 21:41:26 crc kubenswrapper[4777]: I0216 21:41:26.206845 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbfdc" event={"ID":"c6f1ee11-a9b8-4398-a964-04d976458836","Type":"ContainerDied","Data":"104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953"}
Feb 16 21:41:26 crc kubenswrapper[4777]: I0216 21:41:26.210407 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0608d7ad-d8d2-494f-a827-d7ebf1e78a78","Type":"ContainerStarted","Data":"ea078351301abbec1e9fdd0d18e0107a39dd0eeec78a62b391cb40bc5de29ab8"}
Feb 16 21:41:26 crc kubenswrapper[4777]: I0216 21:41:26.210435 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0608d7ad-d8d2-494f-a827-d7ebf1e78a78","Type":"ContainerStarted","Data":"630414cb935ab50887d274ed3444ed22703f2b5d8799a9850b70777cd4801d2c"}
Feb 16 21:41:26 crc kubenswrapper[4777]: I0216 21:41:26.252249 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.252224839 podStartE2EDuration="2.252224839s" podCreationTimestamp="2026-02-16 21:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:41:26.248784379 +0000 UTC m=+206.831285501" watchObservedRunningTime="2026-02-16 21:41:26.252224839 +0000 UTC m=+206.834725941"
Feb 16 21:41:27 crc kubenswrapper[4777]: I0216 21:41:27.222104 4777 generic.go:334] "Generic (PLEG): container finished" podID="01c46266-f08a-405d-8768-8075a15bb61d"
containerID="7765449e873a85b6c1547f45164cc9a3b1826e2d29f00c2f66efb31607476b4c" exitCode=0 Feb 16 21:41:27 crc kubenswrapper[4777]: I0216 21:41:27.222192 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k68t2" event={"ID":"01c46266-f08a-405d-8768-8075a15bb61d","Type":"ContainerDied","Data":"7765449e873a85b6c1547f45164cc9a3b1826e2d29f00c2f66efb31607476b4c"} Feb 16 21:41:27 crc kubenswrapper[4777]: I0216 21:41:27.225432 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv49x" event={"ID":"1e9ca7b4-abe8-480b-8d82-a44eea03d22e","Type":"ContainerStarted","Data":"45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3"} Feb 16 21:41:27 crc kubenswrapper[4777]: I0216 21:41:27.230736 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbfdc" event={"ID":"c6f1ee11-a9b8-4398-a964-04d976458836","Type":"ContainerStarted","Data":"392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d"} Feb 16 21:41:27 crc kubenswrapper[4777]: I0216 21:41:27.294532 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mv49x" podStartSLOduration=2.424809878 podStartE2EDuration="46.294510094s" podCreationTimestamp="2026-02-16 21:40:41 +0000 UTC" firstStartedPulling="2026-02-16 21:40:42.71173495 +0000 UTC m=+163.294236052" lastFinishedPulling="2026-02-16 21:41:26.581435126 +0000 UTC m=+207.163936268" observedRunningTime="2026-02-16 21:41:27.292495316 +0000 UTC m=+207.874996428" watchObservedRunningTime="2026-02-16 21:41:27.294510094 +0000 UTC m=+207.877011206" Feb 16 21:41:27 crc kubenswrapper[4777]: I0216 21:41:27.320273 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xbfdc" podStartSLOduration=2.468544526 podStartE2EDuration="45.3202426s" podCreationTimestamp="2026-02-16 21:40:42 +0000 UTC" 
firstStartedPulling="2026-02-16 21:40:43.767667031 +0000 UTC m=+164.350168133" lastFinishedPulling="2026-02-16 21:41:26.619365105 +0000 UTC m=+207.201866207" observedRunningTime="2026-02-16 21:41:27.30885251 +0000 UTC m=+207.891353612" watchObservedRunningTime="2026-02-16 21:41:27.3202426 +0000 UTC m=+207.902743712" Feb 16 21:41:28 crc kubenswrapper[4777]: I0216 21:41:28.237762 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wtvs" event={"ID":"950a7269-d4a8-4d38-b02f-c41037ef0341","Type":"ContainerStarted","Data":"ed11a9f3b67d2c0d79afeb4bd18f007d8b2d2fcf5b9385c286ad938382d7b610"} Feb 16 21:41:28 crc kubenswrapper[4777]: I0216 21:41:28.239077 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrsdv" event={"ID":"93764402-52dc-48ce-9352-bd7982f7f0d6","Type":"ContainerStarted","Data":"a8f92bfa7ebe9c97001da2914e8e3b3124a6794ccbfc182cb5fc12a753c33e1b"} Feb 16 21:41:28 crc kubenswrapper[4777]: I0216 21:41:28.240925 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k68t2" event={"ID":"01c46266-f08a-405d-8768-8075a15bb61d","Type":"ContainerStarted","Data":"b9e01d6bf7484354224c17867982d582bfa0ed5182c2f9099ce1af9f984441a6"} Feb 16 21:41:28 crc kubenswrapper[4777]: I0216 21:41:28.242774 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxjq" event={"ID":"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf","Type":"ContainerStarted","Data":"bac295f9485b020674522d695d0fe63dd2cbe3bf620785ea9efd01e6d17c1e21"} Feb 16 21:41:28 crc kubenswrapper[4777]: I0216 21:41:28.334934 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k68t2" podStartSLOduration=2.429170735 podStartE2EDuration="47.334912935s" podCreationTimestamp="2026-02-16 21:40:41 +0000 UTC" firstStartedPulling="2026-02-16 21:40:42.711695779 +0000 UTC 
m=+163.294196891" lastFinishedPulling="2026-02-16 21:41:27.617437989 +0000 UTC m=+208.199939091" observedRunningTime="2026-02-16 21:41:28.331725353 +0000 UTC m=+208.914226455" watchObservedRunningTime="2026-02-16 21:41:28.334912935 +0000 UTC m=+208.917414037" Feb 16 21:41:29 crc kubenswrapper[4777]: I0216 21:41:29.253199 4777 generic.go:334] "Generic (PLEG): container finished" podID="950a7269-d4a8-4d38-b02f-c41037ef0341" containerID="ed11a9f3b67d2c0d79afeb4bd18f007d8b2d2fcf5b9385c286ad938382d7b610" exitCode=0 Feb 16 21:41:29 crc kubenswrapper[4777]: I0216 21:41:29.253275 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wtvs" event={"ID":"950a7269-d4a8-4d38-b02f-c41037ef0341","Type":"ContainerDied","Data":"ed11a9f3b67d2c0d79afeb4bd18f007d8b2d2fcf5b9385c286ad938382d7b610"} Feb 16 21:41:29 crc kubenswrapper[4777]: I0216 21:41:29.256110 4777 generic.go:334] "Generic (PLEG): container finished" podID="93764402-52dc-48ce-9352-bd7982f7f0d6" containerID="a8f92bfa7ebe9c97001da2914e8e3b3124a6794ccbfc182cb5fc12a753c33e1b" exitCode=0 Feb 16 21:41:29 crc kubenswrapper[4777]: I0216 21:41:29.256153 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrsdv" event={"ID":"93764402-52dc-48ce-9352-bd7982f7f0d6","Type":"ContainerDied","Data":"a8f92bfa7ebe9c97001da2914e8e3b3124a6794ccbfc182cb5fc12a753c33e1b"} Feb 16 21:41:29 crc kubenswrapper[4777]: I0216 21:41:29.260649 4777 generic.go:334] "Generic (PLEG): container finished" podID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" containerID="bac295f9485b020674522d695d0fe63dd2cbe3bf620785ea9efd01e6d17c1e21" exitCode=0 Feb 16 21:41:29 crc kubenswrapper[4777]: I0216 21:41:29.260684 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxjq" event={"ID":"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf","Type":"ContainerDied","Data":"bac295f9485b020674522d695d0fe63dd2cbe3bf620785ea9efd01e6d17c1e21"} Feb 16 
21:41:31 crc kubenswrapper[4777]: I0216 21:41:31.348067 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:41:31 crc kubenswrapper[4777]: I0216 21:41:31.348165 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:41:31 crc kubenswrapper[4777]: I0216 21:41:31.406339 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:41:31 crc kubenswrapper[4777]: I0216 21:41:31.811880 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:41:31 crc kubenswrapper[4777]: I0216 21:41:31.811965 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:41:31 crc kubenswrapper[4777]: I0216 21:41:31.865266 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:41:32 crc kubenswrapper[4777]: I0216 21:41:32.341738 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:41:32 crc kubenswrapper[4777]: I0216 21:41:32.359628 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:41:32 crc kubenswrapper[4777]: I0216 21:41:32.653241 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:41:32 crc kubenswrapper[4777]: I0216 21:41:32.653312 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:41:33 crc kubenswrapper[4777]: I0216 21:41:33.726527 4777 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-xbfdc" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" containerName="registry-server" probeResult="failure" output=< Feb 16 21:41:33 crc kubenswrapper[4777]: timeout: failed to connect service ":50051" within 1s Feb 16 21:41:33 crc kubenswrapper[4777]: > Feb 16 21:41:34 crc kubenswrapper[4777]: I0216 21:41:34.324335 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxjq" event={"ID":"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf","Type":"ContainerStarted","Data":"fe789c04e2a9539ab8e6cc106f96e70ef884fca95069b6136608e6a240fc54e0"} Feb 16 21:41:34 crc kubenswrapper[4777]: I0216 21:41:34.326569 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wtvs" event={"ID":"950a7269-d4a8-4d38-b02f-c41037ef0341","Type":"ContainerStarted","Data":"a46d7b73b85e983f3e5a8f25f4054803d13557fbaeebf57e6b84a4f95b7e0854"} Feb 16 21:41:34 crc kubenswrapper[4777]: I0216 21:41:34.328174 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b76h6" event={"ID":"f0761e7f-6d18-4832-be21-89d3415e7b21","Type":"ContainerStarted","Data":"ca18a989d5d75588fb4c909d29d32f479c56fa5a9a50ae5c903273f12d93c3a1"} Feb 16 21:41:34 crc kubenswrapper[4777]: I0216 21:41:34.330411 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrsdv" event={"ID":"93764402-52dc-48ce-9352-bd7982f7f0d6","Type":"ContainerStarted","Data":"629b899e1d5dc8bcb2a60f4b27ef8e0f5a975fd73c9ce40270f2f460998b0e8c"} Feb 16 21:41:35 crc kubenswrapper[4777]: I0216 21:41:35.338164 4777 generic.go:334] "Generic (PLEG): container finished" podID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerID="ca18a989d5d75588fb4c909d29d32f479c56fa5a9a50ae5c903273f12d93c3a1" exitCode=0 Feb 16 21:41:35 crc kubenswrapper[4777]: I0216 21:41:35.338219 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-b76h6" event={"ID":"f0761e7f-6d18-4832-be21-89d3415e7b21","Type":"ContainerDied","Data":"ca18a989d5d75588fb4c909d29d32f479c56fa5a9a50ae5c903273f12d93c3a1"} Feb 16 21:41:35 crc kubenswrapper[4777]: I0216 21:41:35.363867 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nrsdv" podStartSLOduration=4.374188642 podStartE2EDuration="56.363839263s" podCreationTimestamp="2026-02-16 21:40:39 +0000 UTC" firstStartedPulling="2026-02-16 21:40:40.599495978 +0000 UTC m=+161.181997080" lastFinishedPulling="2026-02-16 21:41:32.589146559 +0000 UTC m=+213.171647701" observedRunningTime="2026-02-16 21:41:34.355085779 +0000 UTC m=+214.937586901" watchObservedRunningTime="2026-02-16 21:41:35.363839263 +0000 UTC m=+215.946340405" Feb 16 21:41:35 crc kubenswrapper[4777]: I0216 21:41:35.393499 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9wtvs" podStartSLOduration=4.359950604 podStartE2EDuration="56.393475782s" podCreationTimestamp="2026-02-16 21:40:39 +0000 UTC" firstStartedPulling="2026-02-16 21:40:41.627746851 +0000 UTC m=+162.210247953" lastFinishedPulling="2026-02-16 21:41:33.661271979 +0000 UTC m=+214.243773131" observedRunningTime="2026-02-16 21:41:35.389837956 +0000 UTC m=+215.972339068" watchObservedRunningTime="2026-02-16 21:41:35.393475782 +0000 UTC m=+215.975976914" Feb 16 21:41:35 crc kubenswrapper[4777]: I0216 21:41:35.393731 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wzxjq" podStartSLOduration=4.59504529 podStartE2EDuration="57.393724739s" podCreationTimestamp="2026-02-16 21:40:38 +0000 UTC" firstStartedPulling="2026-02-16 21:40:40.585173483 +0000 UTC m=+161.167674585" lastFinishedPulling="2026-02-16 21:41:33.383852902 +0000 UTC m=+213.966354034" observedRunningTime="2026-02-16 21:41:35.376005425 +0000 UTC 
m=+215.958506527" watchObservedRunningTime="2026-02-16 21:41:35.393724739 +0000 UTC m=+215.976225871" Feb 16 21:41:35 crc kubenswrapper[4777]: I0216 21:41:35.636914 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mv49x"] Feb 16 21:41:35 crc kubenswrapper[4777]: I0216 21:41:35.637301 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mv49x" podUID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" containerName="registry-server" containerID="cri-o://45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3" gracePeriod=2 Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.040988 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.174683 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-utilities\") pod \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\" (UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.174766 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhx9b\" (UniqueName: \"kubernetes.io/projected/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-kube-api-access-jhx9b\") pod \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\" (UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.174807 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-catalog-content\") pod \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\" (UID: \"1e9ca7b4-abe8-480b-8d82-a44eea03d22e\") " Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.175836 4777 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-utilities" (OuterVolumeSpecName: "utilities") pod "1e9ca7b4-abe8-480b-8d82-a44eea03d22e" (UID: "1e9ca7b4-abe8-480b-8d82-a44eea03d22e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.187176 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-kube-api-access-jhx9b" (OuterVolumeSpecName: "kube-api-access-jhx9b") pod "1e9ca7b4-abe8-480b-8d82-a44eea03d22e" (UID: "1e9ca7b4-abe8-480b-8d82-a44eea03d22e"). InnerVolumeSpecName "kube-api-access-jhx9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.203062 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e9ca7b4-abe8-480b-8d82-a44eea03d22e" (UID: "1e9ca7b4-abe8-480b-8d82-a44eea03d22e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.276558 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.276590 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.276600 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhx9b\" (UniqueName: \"kubernetes.io/projected/1e9ca7b4-abe8-480b-8d82-a44eea03d22e-kube-api-access-jhx9b\") on node \"crc\" DevicePath \"\"" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.351315 4777 generic.go:334] "Generic (PLEG): container finished" podID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" containerID="45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3" exitCode=0 Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.351376 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv49x" event={"ID":"1e9ca7b4-abe8-480b-8d82-a44eea03d22e","Type":"ContainerDied","Data":"45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3"} Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.351406 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mv49x" event={"ID":"1e9ca7b4-abe8-480b-8d82-a44eea03d22e","Type":"ContainerDied","Data":"eb34482ee5f3b8f5b281d9f1f4a332036df0802ad43df813e45bdb6321e713c1"} Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.351428 4777 scope.go:117] "RemoveContainer" containerID="45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 
21:41:36.351534 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mv49x" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.357001 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b76h6" event={"ID":"f0761e7f-6d18-4832-be21-89d3415e7b21","Type":"ContainerStarted","Data":"63ba03eead54b2094492589d7645f4e4f852c1b89093cf0c4b209fd11e63cdc9"} Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.386658 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b76h6" podStartSLOduration=2.464744305 podStartE2EDuration="54.386634303s" podCreationTimestamp="2026-02-16 21:40:42 +0000 UTC" firstStartedPulling="2026-02-16 21:40:43.788273038 +0000 UTC m=+164.370774140" lastFinishedPulling="2026-02-16 21:41:35.710163016 +0000 UTC m=+216.292664138" observedRunningTime="2026-02-16 21:41:36.38238743 +0000 UTC m=+216.964888562" watchObservedRunningTime="2026-02-16 21:41:36.386634303 +0000 UTC m=+216.969135415" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.391058 4777 scope.go:117] "RemoveContainer" containerID="f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.399059 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mv49x"] Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.403563 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mv49x"] Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.414695 4777 scope.go:117] "RemoveContainer" containerID="b9dd490ad3e75b5fbc8797be48a48c448d01272bf060674cf8e8cbf00ddfe1e4" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.437268 4777 scope.go:117] "RemoveContainer" containerID="45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3" Feb 16 21:41:36 
crc kubenswrapper[4777]: E0216 21:41:36.437809 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3\": container with ID starting with 45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3 not found: ID does not exist" containerID="45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.437877 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3"} err="failed to get container status \"45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3\": rpc error: code = NotFound desc = could not find container \"45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3\": container with ID starting with 45a0c281b0332cec309c9105955b5620dfe0a220c9a4f2d8be6ca32cab33d3c3 not found: ID does not exist" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.437923 4777 scope.go:117] "RemoveContainer" containerID="f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c" Feb 16 21:41:36 crc kubenswrapper[4777]: E0216 21:41:36.438286 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c\": container with ID starting with f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c not found: ID does not exist" containerID="f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.438321 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c"} err="failed to get container status 
\"f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c\": rpc error: code = NotFound desc = could not find container \"f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c\": container with ID starting with f6a08d68562293cd812c99b1f5367a218541855db53e29363a9a03c8bbc5291c not found: ID does not exist" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.438353 4777 scope.go:117] "RemoveContainer" containerID="b9dd490ad3e75b5fbc8797be48a48c448d01272bf060674cf8e8cbf00ddfe1e4" Feb 16 21:41:36 crc kubenswrapper[4777]: E0216 21:41:36.438546 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9dd490ad3e75b5fbc8797be48a48c448d01272bf060674cf8e8cbf00ddfe1e4\": container with ID starting with b9dd490ad3e75b5fbc8797be48a48c448d01272bf060674cf8e8cbf00ddfe1e4 not found: ID does not exist" containerID="b9dd490ad3e75b5fbc8797be48a48c448d01272bf060674cf8e8cbf00ddfe1e4" Feb 16 21:41:36 crc kubenswrapper[4777]: I0216 21:41:36.438569 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9dd490ad3e75b5fbc8797be48a48c448d01272bf060674cf8e8cbf00ddfe1e4"} err="failed to get container status \"b9dd490ad3e75b5fbc8797be48a48c448d01272bf060674cf8e8cbf00ddfe1e4\": rpc error: code = NotFound desc = could not find container \"b9dd490ad3e75b5fbc8797be48a48c448d01272bf060674cf8e8cbf00ddfe1e4\": container with ID starting with b9dd490ad3e75b5fbc8797be48a48c448d01272bf060674cf8e8cbf00ddfe1e4 not found: ID does not exist" Feb 16 21:41:38 crc kubenswrapper[4777]: I0216 21:41:38.193576 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" path="/var/lib/kubelet/pods/1e9ca7b4-abe8-480b-8d82-a44eea03d22e/volumes" Feb 16 21:41:39 crc kubenswrapper[4777]: I0216 21:41:39.168121 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wzxjq" 
Feb 16 21:41:39 crc kubenswrapper[4777]: I0216 21:41:39.168537 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:41:39 crc kubenswrapper[4777]: I0216 21:41:39.234061 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:41:39 crc kubenswrapper[4777]: I0216 21:41:39.375704 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:41:39 crc kubenswrapper[4777]: I0216 21:41:39.376761 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:41:39 crc kubenswrapper[4777]: I0216 21:41:39.419952 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:41:39 crc kubenswrapper[4777]: I0216 21:41:39.446895 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wzxjq"
Feb 16 21:41:39 crc kubenswrapper[4777]: I0216 21:41:39.800191 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9wtvs"
Feb 16 21:41:39 crc kubenswrapper[4777]: I0216 21:41:39.800298 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9wtvs"
Feb 16 21:41:39 crc kubenswrapper[4777]: I0216 21:41:39.873459 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9wtvs"
Feb 16 21:41:40 crc kubenswrapper[4777]: I0216 21:41:40.459633 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9wtvs"
Feb 16 21:41:40 crc kubenswrapper[4777]: I0216 21:41:40.472869 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nrsdv"
Feb 16 21:41:41 crc kubenswrapper[4777]: I0216 21:41:41.652242 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 21:41:41 crc kubenswrapper[4777]: I0216 21:41:41.652351 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 21:41:41 crc kubenswrapper[4777]: I0216 21:41:41.652429 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj"
Feb 16 21:41:41 crc kubenswrapper[4777]: I0216 21:41:41.653539 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 16 21:41:41 crc kubenswrapper[4777]: I0216 21:41:41.653668 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9" gracePeriod=600
Feb 16 21:41:42 crc kubenswrapper[4777]: I0216 21:41:42.373477 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b76h6"
Feb 16 21:41:42 crc kubenswrapper[4777]: I0216 21:41:42.374121 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b76h6"
Feb 16 21:41:42 crc kubenswrapper[4777]: I0216 21:41:42.419819 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9" exitCode=0
Feb 16 21:41:42 crc kubenswrapper[4777]: I0216 21:41:42.419880 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9"}
Feb 16 21:41:42 crc kubenswrapper[4777]: I0216 21:41:42.419916 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"7667578f694ea6d5a67542ec47257b4fa8954f0c5a3bee2ecf97f6ec3a597dc9"}
Feb 16 21:41:42 crc kubenswrapper[4777]: I0216 21:41:42.716209 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xbfdc"
Feb 16 21:41:42 crc kubenswrapper[4777]: I0216 21:41:42.768662 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xbfdc"
Feb 16 21:41:43 crc kubenswrapper[4777]: I0216 21:41:43.421831 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b76h6" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerName="registry-server" probeResult="failure" output=<
Feb 16 21:41:43 crc kubenswrapper[4777]: timeout: failed to connect service ":50051" within 1s
Feb 16 21:41:43 crc kubenswrapper[4777]: >
Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.037273 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9wtvs"]
Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.038105 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9wtvs" podUID="950a7269-d4a8-4d38-b02f-c41037ef0341" containerName="registry-server" containerID="cri-o://a46d7b73b85e983f3e5a8f25f4054803d13557fbaeebf57e6b84a4f95b7e0854" gracePeriod=2
Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.442232 4777 generic.go:334] "Generic (PLEG): container finished" podID="950a7269-d4a8-4d38-b02f-c41037ef0341" containerID="a46d7b73b85e983f3e5a8f25f4054803d13557fbaeebf57e6b84a4f95b7e0854" exitCode=0
Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.442291 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wtvs" event={"ID":"950a7269-d4a8-4d38-b02f-c41037ef0341","Type":"ContainerDied","Data":"a46d7b73b85e983f3e5a8f25f4054803d13557fbaeebf57e6b84a4f95b7e0854"}
Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.571499 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wtvs"
Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.705361 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-utilities\") pod \"950a7269-d4a8-4d38-b02f-c41037ef0341\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") "
Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.705507 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-catalog-content\") pod \"950a7269-d4a8-4d38-b02f-c41037ef0341\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") "
Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.705617 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgbvk\" (UniqueName: \"kubernetes.io/projected/950a7269-d4a8-4d38-b02f-c41037ef0341-kube-api-access-bgbvk\") pod \"950a7269-d4a8-4d38-b02f-c41037ef0341\" (UID: \"950a7269-d4a8-4d38-b02f-c41037ef0341\") "
Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.707133 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-utilities" (OuterVolumeSpecName: "utilities") pod "950a7269-d4a8-4d38-b02f-c41037ef0341" (UID: "950a7269-d4a8-4d38-b02f-c41037ef0341"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.719261 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/950a7269-d4a8-4d38-b02f-c41037ef0341-kube-api-access-bgbvk" (OuterVolumeSpecName: "kube-api-access-bgbvk") pod "950a7269-d4a8-4d38-b02f-c41037ef0341" (UID: "950a7269-d4a8-4d38-b02f-c41037ef0341"). InnerVolumeSpecName "kube-api-access-bgbvk".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.783454 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "950a7269-d4a8-4d38-b02f-c41037ef0341" (UID: "950a7269-d4a8-4d38-b02f-c41037ef0341"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.807337 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.807387 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/950a7269-d4a8-4d38-b02f-c41037ef0341-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:41:44 crc kubenswrapper[4777]: I0216 21:41:44.807420 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgbvk\" (UniqueName: \"kubernetes.io/projected/950a7269-d4a8-4d38-b02f-c41037ef0341-kube-api-access-bgbvk\") on node \"crc\" DevicePath \"\"" Feb 16 21:41:45 crc kubenswrapper[4777]: I0216 21:41:45.454670 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wtvs" event={"ID":"950a7269-d4a8-4d38-b02f-c41037ef0341","Type":"ContainerDied","Data":"65262afdd2f30c358f2d7ba58414d3a746bbdf35645ef2f63e9e77158eccdd9b"} Feb 16 21:41:45 crc kubenswrapper[4777]: I0216 21:41:45.454705 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9wtvs" Feb 16 21:41:45 crc kubenswrapper[4777]: I0216 21:41:45.454819 4777 scope.go:117] "RemoveContainer" containerID="a46d7b73b85e983f3e5a8f25f4054803d13557fbaeebf57e6b84a4f95b7e0854" Feb 16 21:41:45 crc kubenswrapper[4777]: I0216 21:41:45.481216 4777 scope.go:117] "RemoveContainer" containerID="ed11a9f3b67d2c0d79afeb4bd18f007d8b2d2fcf5b9385c286ad938382d7b610" Feb 16 21:41:45 crc kubenswrapper[4777]: I0216 21:41:45.514855 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9wtvs"] Feb 16 21:41:45 crc kubenswrapper[4777]: I0216 21:41:45.519910 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9wtvs"] Feb 16 21:41:45 crc kubenswrapper[4777]: I0216 21:41:45.520471 4777 scope.go:117] "RemoveContainer" containerID="5749f21752d6705fe64765fa47ace21e3ebd93b9c77ffc1f4aa55e4053dede73" Feb 16 21:41:46 crc kubenswrapper[4777]: I0216 21:41:46.193432 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="950a7269-d4a8-4d38-b02f-c41037ef0341" path="/var/lib/kubelet/pods/950a7269-d4a8-4d38-b02f-c41037ef0341/volumes" Feb 16 21:41:46 crc kubenswrapper[4777]: I0216 21:41:46.438689 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbfdc"] Feb 16 21:41:46 crc kubenswrapper[4777]: I0216 21:41:46.439432 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xbfdc" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" containerName="registry-server" containerID="cri-o://392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d" gracePeriod=2 Feb 16 21:41:46 crc kubenswrapper[4777]: I0216 21:41:46.960151 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.148169 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-catalog-content\") pod \"c6f1ee11-a9b8-4398-a964-04d976458836\" (UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.148274 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr27j\" (UniqueName: \"kubernetes.io/projected/c6f1ee11-a9b8-4398-a964-04d976458836-kube-api-access-xr27j\") pod \"c6f1ee11-a9b8-4398-a964-04d976458836\" (UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.148414 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-utilities\") pod \"c6f1ee11-a9b8-4398-a964-04d976458836\" (UID: \"c6f1ee11-a9b8-4398-a964-04d976458836\") " Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.149914 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-utilities" (OuterVolumeSpecName: "utilities") pod "c6f1ee11-a9b8-4398-a964-04d976458836" (UID: "c6f1ee11-a9b8-4398-a964-04d976458836"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.161560 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f1ee11-a9b8-4398-a964-04d976458836-kube-api-access-xr27j" (OuterVolumeSpecName: "kube-api-access-xr27j") pod "c6f1ee11-a9b8-4398-a964-04d976458836" (UID: "c6f1ee11-a9b8-4398-a964-04d976458836"). InnerVolumeSpecName "kube-api-access-xr27j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.249788 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.250217 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr27j\" (UniqueName: \"kubernetes.io/projected/c6f1ee11-a9b8-4398-a964-04d976458836-kube-api-access-xr27j\") on node \"crc\" DevicePath \"\"" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.312334 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6f1ee11-a9b8-4398-a964-04d976458836" (UID: "c6f1ee11-a9b8-4398-a964-04d976458836"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.351856 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f1ee11-a9b8-4398-a964-04d976458836-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.472922 4777 generic.go:334] "Generic (PLEG): container finished" podID="c6f1ee11-a9b8-4398-a964-04d976458836" containerID="392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d" exitCode=0 Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.472984 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbfdc" event={"ID":"c6f1ee11-a9b8-4398-a964-04d976458836","Type":"ContainerDied","Data":"392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d"} Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.473033 4777 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-xbfdc" event={"ID":"c6f1ee11-a9b8-4398-a964-04d976458836","Type":"ContainerDied","Data":"28225ecf9d9fe59ac1baa23320e720604f4582fb6e7d2112c5736dfaa97826c8"} Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.473060 4777 scope.go:117] "RemoveContainer" containerID="392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.473143 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbfdc" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.490663 4777 scope.go:117] "RemoveContainer" containerID="104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.515406 4777 scope.go:117] "RemoveContainer" containerID="88d34a3bdc2ee593da45feeeb65a5c7e2c278b998976d8b97dbae3c34bb9b3a8" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.536412 4777 scope.go:117] "RemoveContainer" containerID="392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d" Feb 16 21:41:47 crc kubenswrapper[4777]: E0216 21:41:47.538404 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d\": container with ID starting with 392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d not found: ID does not exist" containerID="392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.538459 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d"} err="failed to get container status \"392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d\": rpc error: code = NotFound desc = could not find container 
\"392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d\": container with ID starting with 392e2229fc58e57799d06caf4e8e62fceb245cd9c6cf8633eaeab622d449a25d not found: ID does not exist" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.538500 4777 scope.go:117] "RemoveContainer" containerID="104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953" Feb 16 21:41:47 crc kubenswrapper[4777]: E0216 21:41:47.538969 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953\": container with ID starting with 104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953 not found: ID does not exist" containerID="104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.539023 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953"} err="failed to get container status \"104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953\": rpc error: code = NotFound desc = could not find container \"104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953\": container with ID starting with 104a7952028a85cc6aa920795b4402e916a6b5c0798f8c850d96ebc9c3a27953 not found: ID does not exist" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.539064 4777 scope.go:117] "RemoveContainer" containerID="88d34a3bdc2ee593da45feeeb65a5c7e2c278b998976d8b97dbae3c34bb9b3a8" Feb 16 21:41:47 crc kubenswrapper[4777]: E0216 21:41:47.539380 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d34a3bdc2ee593da45feeeb65a5c7e2c278b998976d8b97dbae3c34bb9b3a8\": container with ID starting with 88d34a3bdc2ee593da45feeeb65a5c7e2c278b998976d8b97dbae3c34bb9b3a8 not found: ID does not exist" 
containerID="88d34a3bdc2ee593da45feeeb65a5c7e2c278b998976d8b97dbae3c34bb9b3a8" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.539416 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d34a3bdc2ee593da45feeeb65a5c7e2c278b998976d8b97dbae3c34bb9b3a8"} err="failed to get container status \"88d34a3bdc2ee593da45feeeb65a5c7e2c278b998976d8b97dbae3c34bb9b3a8\": rpc error: code = NotFound desc = could not find container \"88d34a3bdc2ee593da45feeeb65a5c7e2c278b998976d8b97dbae3c34bb9b3a8\": container with ID starting with 88d34a3bdc2ee593da45feeeb65a5c7e2c278b998976d8b97dbae3c34bb9b3a8 not found: ID does not exist" Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.542413 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbfdc"] Feb 16 21:41:47 crc kubenswrapper[4777]: I0216 21:41:47.546927 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xbfdc"] Feb 16 21:41:48 crc kubenswrapper[4777]: I0216 21:41:48.189174 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" path="/var/lib/kubelet/pods/c6f1ee11-a9b8-4398-a964-04d976458836/volumes" Feb 16 21:41:51 crc kubenswrapper[4777]: I0216 21:41:51.487508 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5pdvt"] Feb 16 21:41:52 crc kubenswrapper[4777]: I0216 21:41:52.426115 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:41:52 crc kubenswrapper[4777]: I0216 21:41:52.493279 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.628771 4777 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.629523 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" containerName="registry-server" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629539 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" containerName="registry-server" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.629551 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950a7269-d4a8-4d38-b02f-c41037ef0341" containerName="registry-server" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629558 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="950a7269-d4a8-4d38-b02f-c41037ef0341" containerName="registry-server" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.629566 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" containerName="extract-utilities" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629574 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" containerName="extract-utilities" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.629584 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" containerName="extract-content" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629590 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" containerName="extract-content" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.629601 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950a7269-d4a8-4d38-b02f-c41037ef0341" containerName="extract-content" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629606 4777 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="950a7269-d4a8-4d38-b02f-c41037ef0341" containerName="extract-content" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.629622 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" containerName="registry-server" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629629 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" containerName="registry-server" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.629636 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" containerName="extract-utilities" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629642 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" containerName="extract-utilities" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.629651 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950a7269-d4a8-4d38-b02f-c41037ef0341" containerName="extract-utilities" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629656 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="950a7269-d4a8-4d38-b02f-c41037ef0341" containerName="extract-utilities" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.629664 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" containerName="extract-content" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629669 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" containerName="extract-content" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629773 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f1ee11-a9b8-4398-a964-04d976458836" containerName="registry-server" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629785 4777 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="1e9ca7b4-abe8-480b-8d82-a44eea03d22e" containerName="registry-server" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.629792 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="950a7269-d4a8-4d38-b02f-c41037ef0341" containerName="registry-server" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.630176 4777 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.630360 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.630470 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51" gracePeriod=15 Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.630616 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a" gracePeriod=15 Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.630595 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996" gracePeriod=15 Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.630606 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d" gracePeriod=15 Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.630801 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9" gracePeriod=15 Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.631968 4777 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.632105 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632115 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.632124 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632130 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.632143 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632155 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 16 21:42:03 crc 
kubenswrapper[4777]: E0216 21:42:03.632169 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632176 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.632188 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632194 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.632211 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632218 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 16 21:42:03 crc kubenswrapper[4777]: E0216 21:42:03.632226 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632232 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632315 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632326 4777 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632334 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632341 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632350 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.632550 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.677100 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.713845 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.713922 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.713946 4777 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.713971 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.713994 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.714071 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.714095 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:03 crc 
kubenswrapper[4777]: I0216 21:42:03.714165 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817209 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817255 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817284 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817306 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 
21:42:03.817337 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817363 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817353 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817441 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817537 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817514 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817488 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817566 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817606 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817618 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817634 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.817685 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:03 crc kubenswrapper[4777]: I0216 21:42:03.974280 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:04 crc kubenswrapper[4777]: W0216 21:42:04.011489 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-a89b3368c7952919f8870ef3a6cd83962f7ceda12509f57b6082f27486e5fb92 WatchSource:0}: Error finding container a89b3368c7952919f8870ef3a6cd83962f7ceda12509f57b6082f27486e5fb92: Status 404 returned error can't find the container with id a89b3368c7952919f8870ef3a6cd83962f7ceda12509f57b6082f27486e5fb92 Feb 16 21:42:04 crc kubenswrapper[4777]: E0216 21:42:04.017209 4777 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.203:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1894d811b45e06cd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-16 21:42:04.015552205 +0000 UTC m=+244.598053307,LastTimestamp:2026-02-16 21:42:04.015552205 +0000 UTC m=+244.598053307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.629940 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f3e389f78403ebe8b0abb95cf04041926a80fc1e389a87164f235f7e510a758d"} Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.630278 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a89b3368c7952919f8870ef3a6cd83962f7ceda12509f57b6082f27486e5fb92"} Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.631923 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.632695 4777 generic.go:334] "Generic (PLEG): container finished" 
podID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" containerID="ea078351301abbec1e9fdd0d18e0107a39dd0eeec78a62b391cb40bc5de29ab8" exitCode=0 Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.632759 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0608d7ad-d8d2-494f-a827-d7ebf1e78a78","Type":"ContainerDied","Data":"ea078351301abbec1e9fdd0d18e0107a39dd0eeec78a62b391cb40bc5de29ab8"} Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.633987 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.634405 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.636105 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.637913 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.639359 4777 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9" exitCode=0 Feb 16 21:42:04 
crc kubenswrapper[4777]: I0216 21:42:04.639399 4777 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996" exitCode=0 Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.639416 4777 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d" exitCode=0 Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.639431 4777 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a" exitCode=2 Feb 16 21:42:04 crc kubenswrapper[4777]: I0216 21:42:04.639463 4777 scope.go:117] "RemoveContainer" containerID="43496f51879dd9ea756ff6cd021b630cbc36e7299a93a41abc6273401c02337b" Feb 16 21:42:05 crc kubenswrapper[4777]: I0216 21:42:05.649655 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.045339 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.047093 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.048056 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.048953 4777 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.049665 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.050985 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.051287 4777 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.051623 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.052215 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.155365 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.155461 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kubelet-dir\") pod \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 
21:42:06.155522 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kube-api-access\") pod \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.155558 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.155589 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.155633 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.155678 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.155767 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0608d7ad-d8d2-494f-a827-d7ebf1e78a78" (UID: "0608d7ad-d8d2-494f-a827-d7ebf1e78a78"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.155918 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-var-lock\") pod \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\" (UID: \"0608d7ad-d8d2-494f-a827-d7ebf1e78a78\") " Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.155957 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-var-lock" (OuterVolumeSpecName: "var-lock") pod "0608d7ad-d8d2-494f-a827-d7ebf1e78a78" (UID: "0608d7ad-d8d2-494f-a827-d7ebf1e78a78"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.156023 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.156678 4777 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.156738 4777 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.156757 4777 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.156777 4777 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.156797 4777 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-var-lock\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.163391 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0608d7ad-d8d2-494f-a827-d7ebf1e78a78" (UID: "0608d7ad-d8d2-494f-a827-d7ebf1e78a78"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.195302 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.258103 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0608d7ad-d8d2-494f-a827-d7ebf1e78a78-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.673378 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0608d7ad-d8d2-494f-a827-d7ebf1e78a78","Type":"ContainerDied","Data":"630414cb935ab50887d274ed3444ed22703f2b5d8799a9850b70777cd4801d2c"} Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.673444 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.673464 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="630414cb935ab50887d274ed3444ed22703f2b5d8799a9850b70777cd4801d2c" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.678969 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.679945 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.680726 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.681130 4777 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51" exitCode=0 Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.681186 4777 scope.go:117] "RemoveContainer" containerID="9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.681342 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.682315 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.683042 4777 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.683389 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.685773 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.686200 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.686661 4777 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.715853 4777 scope.go:117] "RemoveContainer" containerID="fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.752390 4777 scope.go:117] "RemoveContainer" containerID="f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.775950 4777 scope.go:117] "RemoveContainer" containerID="a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.803616 4777 scope.go:117] "RemoveContainer" containerID="df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.831952 4777 scope.go:117] "RemoveContainer" containerID="687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.868950 4777 scope.go:117] "RemoveContainer" containerID="9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9" Feb 16 21:42:06 crc kubenswrapper[4777]: E0216 21:42:06.869580 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\": container with ID starting with 9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9 not 
found: ID does not exist" containerID="9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.869614 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9"} err="failed to get container status \"9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\": rpc error: code = NotFound desc = could not find container \"9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9\": container with ID starting with 9d56f0424f078ca144113195f946f01e88f4cf6916de1e14cfc671838b5f51a9 not found: ID does not exist" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.869646 4777 scope.go:117] "RemoveContainer" containerID="fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996" Feb 16 21:42:06 crc kubenswrapper[4777]: E0216 21:42:06.870188 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\": container with ID starting with fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996 not found: ID does not exist" containerID="fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.870216 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996"} err="failed to get container status \"fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\": rpc error: code = NotFound desc = could not find container \"fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996\": container with ID starting with fee5a31721375c1b3d0b883999c377cbff67d096d510afd552b21ac51e82f996 not found: ID does not exist" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.870236 
4777 scope.go:117] "RemoveContainer" containerID="f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d" Feb 16 21:42:06 crc kubenswrapper[4777]: E0216 21:42:06.870591 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\": container with ID starting with f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d not found: ID does not exist" containerID="f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.870659 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d"} err="failed to get container status \"f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\": rpc error: code = NotFound desc = could not find container \"f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d\": container with ID starting with f80dc3767d8baebd763165a98d46862e23893b6c31d67a8b3614e8365e2d165d not found: ID does not exist" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.870698 4777 scope.go:117] "RemoveContainer" containerID="a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a" Feb 16 21:42:06 crc kubenswrapper[4777]: E0216 21:42:06.871197 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\": container with ID starting with a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a not found: ID does not exist" containerID="a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.871263 4777 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a"} err="failed to get container status \"a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\": rpc error: code = NotFound desc = could not find container \"a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a\": container with ID starting with a7f1500706cf1ef195e8f0bdf2570dd90de7d8e54d7914d6c9e29ed17fe3949a not found: ID does not exist" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.871308 4777 scope.go:117] "RemoveContainer" containerID="df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51" Feb 16 21:42:06 crc kubenswrapper[4777]: E0216 21:42:06.872530 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\": container with ID starting with df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51 not found: ID does not exist" containerID="df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.872595 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51"} err="failed to get container status \"df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\": rpc error: code = NotFound desc = could not find container \"df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51\": container with ID starting with df85dc5885f76281b236668b7926372432db513552800b6e41b29133ec69fb51 not found: ID does not exist" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.872618 4777 scope.go:117] "RemoveContainer" containerID="687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae" Feb 16 21:42:06 crc kubenswrapper[4777]: E0216 21:42:06.872953 4777 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\": container with ID starting with 687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae not found: ID does not exist" containerID="687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae" Feb 16 21:42:06 crc kubenswrapper[4777]: I0216 21:42:06.872991 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae"} err="failed to get container status \"687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\": rpc error: code = NotFound desc = could not find container \"687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae\": container with ID starting with 687221054c7c289caa194f14a620ae44d37b93965625119be5034bdc36b9baae not found: ID does not exist" Feb 16 21:42:08 crc kubenswrapper[4777]: E0216 21:42:08.277405 4777 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.203:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1894d811b45e06cd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-16 21:42:04.015552205 +0000 UTC m=+244.598053307,LastTimestamp:2026-02-16 21:42:04.015552205 +0000 UTC 
m=+244.598053307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 16 21:42:10 crc kubenswrapper[4777]: I0216 21:42:10.186443 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:10 crc kubenswrapper[4777]: I0216 21:42:10.187934 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:10 crc kubenswrapper[4777]: E0216 21:42:10.506142 4777 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:10 crc kubenswrapper[4777]: E0216 21:42:10.506977 4777 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:10 crc kubenswrapper[4777]: E0216 21:42:10.507691 4777 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:10 crc kubenswrapper[4777]: E0216 21:42:10.508082 4777 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:10 crc kubenswrapper[4777]: E0216 21:42:10.508652 4777 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:10 crc kubenswrapper[4777]: I0216 21:42:10.508761 4777 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 16 21:42:10 crc kubenswrapper[4777]: E0216 21:42:10.509248 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" interval="200ms" Feb 16 21:42:10 crc kubenswrapper[4777]: E0216 21:42:10.710298 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" interval="400ms" Feb 16 21:42:11 crc kubenswrapper[4777]: E0216 21:42:11.112301 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" interval="800ms" Feb 16 21:42:11 crc kubenswrapper[4777]: E0216 21:42:11.914135 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" interval="1.6s" Feb 16 
21:42:13 crc kubenswrapper[4777]: E0216 21:42:13.516864 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" interval="3.2s" Feb 16 21:42:16 crc kubenswrapper[4777]: I0216 21:42:16.513804 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" podUID="fc7762e5-2274-4c99-8d8c-0fb23340150f" containerName="oauth-openshift" containerID="cri-o://46a9bdcbc65b5e3b249f67d2b481d6168376ffbeb75dc0763ce7429c09097bc9" gracePeriod=15 Feb 16 21:42:16 crc kubenswrapper[4777]: E0216 21:42:16.718778 4777 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.203:6443: connect: connection refused" interval="6.4s" Feb 16 21:42:16 crc kubenswrapper[4777]: I0216 21:42:16.769156 4777 generic.go:334] "Generic (PLEG): container finished" podID="fc7762e5-2274-4c99-8d8c-0fb23340150f" containerID="46a9bdcbc65b5e3b249f67d2b481d6168376ffbeb75dc0763ce7429c09097bc9" exitCode=0 Feb 16 21:42:16 crc kubenswrapper[4777]: I0216 21:42:16.769242 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" event={"ID":"fc7762e5-2274-4c99-8d8c-0fb23340150f","Type":"ContainerDied","Data":"46a9bdcbc65b5e3b249f67d2b481d6168376ffbeb75dc0763ce7429c09097bc9"} Feb 16 21:42:16 crc kubenswrapper[4777]: I0216 21:42:16.973836 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:42:16 crc kubenswrapper[4777]: I0216 21:42:16.975075 4777 status_manager.go:851] "Failed to get status for pod" podUID="fc7762e5-2274-4c99-8d8c-0fb23340150f" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-5pdvt\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:16 crc kubenswrapper[4777]: I0216 21:42:16.975633 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:16 crc kubenswrapper[4777]: I0216 21:42:16.976104 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.118915 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-login\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119119 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-dir\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: 
\"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119173 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-serving-cert\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119214 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw76z\" (UniqueName: \"kubernetes.io/projected/fc7762e5-2274-4c99-8d8c-0fb23340150f-kube-api-access-gw76z\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119262 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-provider-selection\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119299 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-session\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119293 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119354 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-error\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119394 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-trusted-ca-bundle\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119426 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-policies\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119470 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-cliconfig\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119511 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-router-certs\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc 
kubenswrapper[4777]: I0216 21:42:17.119558 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-idp-0-file-data\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119612 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-service-ca\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.119671 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-ocp-branding-template\") pod \"fc7762e5-2274-4c99-8d8c-0fb23340150f\" (UID: \"fc7762e5-2274-4c99-8d8c-0fb23340150f\") " Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.120935 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.120966 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.121366 4777 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.121398 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.121415 4777 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fc7762e5-2274-4c99-8d8c-0fb23340150f-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.121168 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.121563 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.127250 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.129832 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.129873 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc7762e5-2274-4c99-8d8c-0fb23340150f-kube-api-access-gw76z" (OuterVolumeSpecName: "kube-api-access-gw76z") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "kube-api-access-gw76z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.130275 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.130968 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.131479 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.131929 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.132105 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.134765 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fc7762e5-2274-4c99-8d8c-0fb23340150f" (UID: "fc7762e5-2274-4c99-8d8c-0fb23340150f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.181750 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.183028 4777 status_manager.go:851] "Failed to get status for pod" podUID="fc7762e5-2274-4c99-8d8c-0fb23340150f" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-5pdvt\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.183420 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.183919 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 
21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.200767 4777 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c4e8a841-543c-4825-9320-d66b0bc2438e" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.200817 4777 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c4e8a841-543c-4825-9320-d66b0bc2438e" Feb 16 21:42:17 crc kubenswrapper[4777]: E0216 21:42:17.201419 4777 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.202139 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.222566 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.222653 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw76z\" (UniqueName: \"kubernetes.io/projected/fc7762e5-2274-4c99-8d8c-0fb23340150f-kube-api-access-gw76z\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.222682 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.222741 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.222772 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.222797 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.222823 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.222846 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.222868 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.222887 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc 
kubenswrapper[4777]: I0216 21:42:17.222905 4777 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fc7762e5-2274-4c99-8d8c-0fb23340150f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:17 crc kubenswrapper[4777]: E0216 21:42:17.223088 4777 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.203:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" volumeName="registry-storage" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.779946 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" event={"ID":"fc7762e5-2274-4c99-8d8c-0fb23340150f","Type":"ContainerDied","Data":"0a7db6fe8ab82d3a0427e64d4bb252e957376e67e93c0e845b99f5c8955ede06"} Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.780046 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.780464 4777 scope.go:117] "RemoveContainer" containerID="46a9bdcbc65b5e3b249f67d2b481d6168376ffbeb75dc0763ce7429c09097bc9" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.781926 4777 status_manager.go:851] "Failed to get status for pod" podUID="fc7762e5-2274-4c99-8d8c-0fb23340150f" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-5pdvt\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.782317 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.782668 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.784040 4777 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3afa5c1ad5e930fde9c0270b3c43222b550a660dfdf419c0825dfe4ea0b34efd" exitCode=0 Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.784147 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3afa5c1ad5e930fde9c0270b3c43222b550a660dfdf419c0825dfe4ea0b34efd"} Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.784535 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a518d1cc3c18ea1fd1141a867d1bd0e49b2aaf8daf37fccd1be0c1b44f2a5f3e"} Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.785050 4777 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c4e8a841-543c-4825-9320-d66b0bc2438e" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.785085 4777 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c4e8a841-543c-4825-9320-d66b0bc2438e" Feb 16 21:42:17 crc kubenswrapper[4777]: E0216 21:42:17.785763 4777 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.785775 4777 status_manager.go:851] "Failed to get status for pod" podUID="fc7762e5-2274-4c99-8d8c-0fb23340150f" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-5pdvt\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.786678 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": 
dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.787177 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.810697 4777 status_manager.go:851] "Failed to get status for pod" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.811562 4777 status_manager.go:851] "Failed to get status for pod" podUID="fc7762e5-2274-4c99-8d8c-0fb23340150f" pod="openshift-authentication/oauth-openshift-558db77b4-5pdvt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-5pdvt\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:17 crc kubenswrapper[4777]: I0216 21:42:17.812166 4777 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.203:6443: connect: connection refused" Feb 16 21:42:18 crc kubenswrapper[4777]: I0216 21:42:18.796539 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"acdb6539c363080d7be966583987b141bd8f28aacdbbb18e158c409b49991470"} Feb 16 21:42:18 crc 
kubenswrapper[4777]: I0216 21:42:18.796597 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3a195e6c8e601067a15271bf6f33276f8de7617eb1379cba40c4f0ce8f0b7cb3"} Feb 16 21:42:18 crc kubenswrapper[4777]: I0216 21:42:18.802177 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 16 21:42:18 crc kubenswrapper[4777]: I0216 21:42:18.802236 4777 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce" exitCode=1 Feb 16 21:42:18 crc kubenswrapper[4777]: I0216 21:42:18.802269 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce"} Feb 16 21:42:18 crc kubenswrapper[4777]: I0216 21:42:18.802852 4777 scope.go:117] "RemoveContainer" containerID="ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce" Feb 16 21:42:18 crc kubenswrapper[4777]: I0216 21:42:18.865674 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:42:19 crc kubenswrapper[4777]: I0216 21:42:19.711217 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:42:19 crc kubenswrapper[4777]: I0216 21:42:19.811614 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7e82c9c9f827dfcc468ea688ffa2c4fa0b6fcc8b2f5d4d8cc75de4df0025b9de"} Feb 16 21:42:19 crc kubenswrapper[4777]: I0216 21:42:19.811678 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b0e8a9c72cea829be06da7dd4bed1cb420f29c02416b7234d4c9ca7668d86a3a"} Feb 16 21:42:19 crc kubenswrapper[4777]: I0216 21:42:19.811693 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e7992a2a1915962cbd8381a22ae25ab425af6024368c660a2b910431d6261926"} Feb 16 21:42:19 crc kubenswrapper[4777]: I0216 21:42:19.812082 4777 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c4e8a841-543c-4825-9320-d66b0bc2438e" Feb 16 21:42:19 crc kubenswrapper[4777]: I0216 21:42:19.812102 4777 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c4e8a841-543c-4825-9320-d66b0bc2438e" Feb 16 21:42:19 crc kubenswrapper[4777]: I0216 21:42:19.812329 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:19 crc kubenswrapper[4777]: I0216 21:42:19.815186 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 16 21:42:19 crc kubenswrapper[4777]: I0216 21:42:19.815285 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eddc5314f32b82f4be0c45c15f240220162b0d5b5bf593cbf8c7f7b42a2b8616"} Feb 16 21:42:22 crc 
kubenswrapper[4777]: I0216 21:42:22.202949 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:22 crc kubenswrapper[4777]: I0216 21:42:22.203898 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:22 crc kubenswrapper[4777]: I0216 21:42:22.212007 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:24 crc kubenswrapper[4777]: I0216 21:42:24.824527 4777 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:24 crc kubenswrapper[4777]: I0216 21:42:24.915780 4777 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="eebedec7-fa5e-447d-adfc-2e502db6ccae" Feb 16 21:42:25 crc kubenswrapper[4777]: I0216 21:42:25.857037 4777 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c4e8a841-543c-4825-9320-d66b0bc2438e" Feb 16 21:42:25 crc kubenswrapper[4777]: I0216 21:42:25.857085 4777 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c4e8a841-543c-4825-9320-d66b0bc2438e" Feb 16 21:42:25 crc kubenswrapper[4777]: I0216 21:42:25.860982 4777 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="eebedec7-fa5e-447d-adfc-2e502db6ccae" Feb 16 21:42:25 crc kubenswrapper[4777]: I0216 21:42:25.864778 4777 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://3a195e6c8e601067a15271bf6f33276f8de7617eb1379cba40c4f0ce8f0b7cb3" 
Feb 16 21:42:25 crc kubenswrapper[4777]: I0216 21:42:25.864812 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:26 crc kubenswrapper[4777]: I0216 21:42:26.785978 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:42:26 crc kubenswrapper[4777]: I0216 21:42:26.786322 4777 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 16 21:42:26 crc kubenswrapper[4777]: I0216 21:42:26.786449 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 16 21:42:26 crc kubenswrapper[4777]: I0216 21:42:26.862750 4777 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c4e8a841-543c-4825-9320-d66b0bc2438e" Feb 16 21:42:26 crc kubenswrapper[4777]: I0216 21:42:26.862813 4777 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c4e8a841-543c-4825-9320-d66b0bc2438e" Feb 16 21:42:26 crc kubenswrapper[4777]: I0216 21:42:26.867020 4777 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="eebedec7-fa5e-447d-adfc-2e502db6ccae" Feb 16 21:42:28 crc kubenswrapper[4777]: I0216 21:42:28.865638 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:42:35 crc kubenswrapper[4777]: I0216 21:42:35.184400 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 16 21:42:35 crc kubenswrapper[4777]: I0216 21:42:35.195946 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 16 21:42:35 crc kubenswrapper[4777]: I0216 21:42:35.344099 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 16 21:42:35 crc kubenswrapper[4777]: I0216 21:42:35.438932 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 16 21:42:35 crc kubenswrapper[4777]: I0216 21:42:35.709153 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 16 21:42:35 crc kubenswrapper[4777]: I0216 21:42:35.860245 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 16 21:42:35 crc kubenswrapper[4777]: I0216 21:42:35.957776 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 16 21:42:36 crc kubenswrapper[4777]: I0216 21:42:36.157010 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 16 21:42:36 crc kubenswrapper[4777]: I0216 21:42:36.427768 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 16 21:42:36 crc kubenswrapper[4777]: I0216 21:42:36.490985 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 16 21:42:36 crc kubenswrapper[4777]: I0216 21:42:36.529921 4777 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 16 21:42:36 crc kubenswrapper[4777]: I0216 21:42:36.533094 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 16 21:42:36 crc kubenswrapper[4777]: I0216 21:42:36.787172 4777 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 16 21:42:36 crc kubenswrapper[4777]: I0216 21:42:36.787282 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 16 21:42:36 crc kubenswrapper[4777]: I0216 21:42:36.794671 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 16 21:42:36 crc kubenswrapper[4777]: I0216 21:42:36.931280 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 16 21:42:37 crc kubenswrapper[4777]: I0216 21:42:37.248153 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 16 21:42:37 crc kubenswrapper[4777]: I0216 21:42:37.360917 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 16 21:42:37 crc kubenswrapper[4777]: I0216 21:42:37.543326 4777 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"encryption-config-1" Feb 16 21:42:37 crc kubenswrapper[4777]: I0216 21:42:37.560843 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 16 21:42:37 crc kubenswrapper[4777]: I0216 21:42:37.598062 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 16 21:42:37 crc kubenswrapper[4777]: I0216 21:42:37.661663 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 16 21:42:37 crc kubenswrapper[4777]: I0216 21:42:37.706271 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 16 21:42:37 crc kubenswrapper[4777]: I0216 21:42:37.757091 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 16 21:42:37 crc kubenswrapper[4777]: I0216 21:42:37.831530 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 16 21:42:37 crc kubenswrapper[4777]: I0216 21:42:37.893339 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 16 21:42:37 crc kubenswrapper[4777]: I0216 21:42:37.985641 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.037362 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.038654 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 16 21:42:38 crc kubenswrapper[4777]: 
I0216 21:42:38.047119 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.064003 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.155677 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.232540 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.269351 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.361915 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.370080 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.396104 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.423056 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.463611 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.506100 4777 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.514176 
4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=35.514142455 podStartE2EDuration="35.514142455s" podCreationTimestamp="2026-02-16 21:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:42:24.880904018 +0000 UTC m=+265.463405130" watchObservedRunningTime="2026-02-16 21:42:38.514142455 +0000 UTC m=+279.096643587" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.514550 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.515484 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5pdvt","openshift-kube-apiserver/kube-apiserver-crc"] Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.515568 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.523175 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.525165 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.551183 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.551148884 podStartE2EDuration="14.551148884s" podCreationTimestamp="2026-02-16 21:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:42:38.547475429 +0000 UTC m=+279.129976571" watchObservedRunningTime="2026-02-16 
21:42:38.551148884 +0000 UTC m=+279.133650026" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.574352 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.611883 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.688629 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.853312 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.862159 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.960818 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 16 21:42:38 crc kubenswrapper[4777]: I0216 21:42:38.989686 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.030023 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.134197 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7455d675f6-qz7sh"] Feb 16 21:42:39 crc kubenswrapper[4777]: E0216 21:42:39.134689 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7762e5-2274-4c99-8d8c-0fb23340150f" containerName="oauth-openshift" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.134823 4777 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fc7762e5-2274-4c99-8d8c-0fb23340150f" containerName="oauth-openshift" Feb 16 21:42:39 crc kubenswrapper[4777]: E0216 21:42:39.134924 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" containerName="installer" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.135006 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" containerName="installer" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.135212 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="0608d7ad-d8d2-494f-a827-d7ebf1e78a78" containerName="installer" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.135302 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7762e5-2274-4c99-8d8c-0fb23340150f" containerName="oauth-openshift" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.135845 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.140627 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.140685 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.140897 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.141057 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.141134 4777 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.141059 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.141255 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.141415 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.141451 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.141609 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.142087 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.144469 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.150743 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.151295 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.173672 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 16 
21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.229859 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.249775 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.249819 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-service-ca\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.249848 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-audit-policies\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.249879 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-template-error\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " 
pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.249901 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-router-certs\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.249925 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.249950 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.249968 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-template-login\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.249992 4777 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-audit-dir\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.250017 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vslwz\" (UniqueName: \"kubernetes.io/projected/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-kube-api-access-vslwz\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.250036 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.250053 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.250078 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.250100 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-session\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.260568 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.266430 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.352228 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.352317 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.352353 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-session\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.352407 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.352435 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-service-ca\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.352478 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-audit-policies\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.352514 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-template-error\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.352544 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-router-certs\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.352574 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.353883 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.353954 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-template-login\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.354003 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-audit-dir\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.354053 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vslwz\" (UniqueName: \"kubernetes.io/projected/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-kube-api-access-vslwz\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.354089 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.355122 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.355206 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-audit-dir\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.356766 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-service-ca\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.356886 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.357062 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-audit-policies\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.361224 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-template-error\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.364861 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.365653 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.365928 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-router-certs\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.373631 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.374368 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.376869 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-user-template-login\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.378432 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-v4-0-config-system-session\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.399891 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vslwz\" (UniqueName: \"kubernetes.io/projected/fd57637d-6a5f-43f7-99be-cfb88ae06cdb-kube-api-access-vslwz\") pod \"oauth-openshift-7455d675f6-qz7sh\" (UID: \"fd57637d-6a5f-43f7-99be-cfb88ae06cdb\") " pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.467298 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.527696 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.553808 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.683681 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.705004 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.736538 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.761334 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.762751 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.804934 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.908733 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.908833 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.946708 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 16 21:42:39 crc kubenswrapper[4777]: I0216 21:42:39.968549 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.085098 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.110381 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.121349 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.154583 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.179943 4777 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.199447 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc7762e5-2274-4c99-8d8c-0fb23340150f" path="/var/lib/kubelet/pods/fc7762e5-2274-4c99-8d8c-0fb23340150f/volumes"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.208086 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.208838 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.235678 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.269708 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.355901 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.371562 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.372149 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.409187 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.571436 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.627331 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.763701 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 16 21:42:40 crc kubenswrapper[4777]: I0216 21:42:40.781023 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.015125 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.019320 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.035624 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.251427 4777 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.259617 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.292445 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.307803 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.361675 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.420627 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.436961 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.456822 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.480584 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.537839 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.765795 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.800428 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.868946 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 16 21:42:41 crc kubenswrapper[4777]: I0216 21:42:41.947068 4777 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.081415 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.120344 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.125458 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.131612 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.175332 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.208448 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.279125 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.401270 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.406250 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.574758 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.581399 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.688817 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.739339 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.739398 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.760913 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.791704 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.807927 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.850689 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.881190 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 16 21:42:42 crc kubenswrapper[4777]: I0216 21:42:42.988170 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.139539 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.141628 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.193333 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.199245 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.216463 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.319382 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.383856 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.503489 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.629578 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.644873 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.646000 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.659865 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.686050 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.690633 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.724084 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.749982 4777 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.775854 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.828925 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.847329 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.905035 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.946417 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.954193 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.960009 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.981677 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 16 21:42:43 crc kubenswrapper[4777]: I0216 21:42:43.995441 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.015761 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.054470 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.240670 4777 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.262803 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.330090 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.369601 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.560623 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.630270 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.634553 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.718622 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.861296 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.885638 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.912338 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.929828 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.962608 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.985366 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 16 21:42:44 crc kubenswrapper[4777]: I0216 21:42:44.989047 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.017634 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.038830 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.053349 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.092373 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.183391 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.223615 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.239455 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.239796 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.240429 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.254082 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.490369 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.559848 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.680555 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.751176 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.766052 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.801926 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 16 21:42:45 crc kubenswrapper[4777]: I0216 21:42:45.995230 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 16 21:42:45 crc kubenswrapper[4777]: 
I0216 21:42:45.999000 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.059049 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.182517 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.216797 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.259117 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.308879 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.374774 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.382955 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.423869 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.441196 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.491506 4777 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.541808 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.562076 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.608994 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.722253 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.743162 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.783497 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.786537 4777 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.786626 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 16 
21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.786804 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.787922 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"eddc5314f32b82f4be0c45c15f240220162b0d5b5bf593cbf8c7f7b42a2b8616"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.788190 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://eddc5314f32b82f4be0c45c15f240220162b0d5b5bf593cbf8c7f7b42a2b8616" gracePeriod=30 Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.820882 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.862517 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.876017 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.902195 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.911940 4777 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.913086 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 16 21:42:46 crc kubenswrapper[4777]: I0216 21:42:46.933529 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.060819 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.067028 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.078054 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.191011 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.217132 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.256054 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.285516 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.421731 4777 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.490803 4777 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.491185 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f3e389f78403ebe8b0abb95cf04041926a80fc1e389a87164f235f7e510a758d" gracePeriod=5 Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.524308 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.536424 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7455d675f6-qz7sh"] Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.717593 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.787738 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.852915 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.897943 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.904704 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7455d675f6-qz7sh"] Feb 16 21:42:47 crc kubenswrapper[4777]: I0216 21:42:47.904755 4777 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 16 21:42:48 crc kubenswrapper[4777]: I0216 21:42:48.009763 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" event={"ID":"fd57637d-6a5f-43f7-99be-cfb88ae06cdb","Type":"ContainerStarted","Data":"56d815447c71179be11bac1cd258e3beefcc3cc50cd79d343b1d3781871db944"} Feb 16 21:42:48 crc kubenswrapper[4777]: I0216 21:42:48.027437 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 16 21:42:48 crc kubenswrapper[4777]: I0216 21:42:48.156441 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 16 21:42:48 crc kubenswrapper[4777]: I0216 21:42:48.815106 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 16 21:42:48 crc kubenswrapper[4777]: I0216 21:42:48.906444 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 16 21:42:49 crc kubenswrapper[4777]: I0216 21:42:49.021984 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" event={"ID":"fd57637d-6a5f-43f7-99be-cfb88ae06cdb","Type":"ContainerStarted","Data":"40809e986ea53f49c63d651cddc5f6746af4d6eea0d98ba0804c20fb649b5961"} Feb 16 21:42:49 crc kubenswrapper[4777]: I0216 21:42:49.022451 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:49 crc kubenswrapper[4777]: I0216 21:42:49.032118 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" Feb 16 21:42:49 crc kubenswrapper[4777]: I0216 21:42:49.081627 4777 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7455d675f6-qz7sh" podStartSLOduration=58.081594412 podStartE2EDuration="58.081594412s" podCreationTimestamp="2026-02-16 21:41:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:42:49.056987057 +0000 UTC m=+289.639488199" watchObservedRunningTime="2026-02-16 21:42:49.081594412 +0000 UTC m=+289.664095554" Feb 16 21:42:49 crc kubenswrapper[4777]: I0216 21:42:49.153896 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 16 21:42:49 crc kubenswrapper[4777]: I0216 21:42:49.275552 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 16 21:42:49 crc kubenswrapper[4777]: I0216 21:42:49.502898 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 16 21:42:49 crc kubenswrapper[4777]: I0216 21:42:49.564359 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 16 21:42:49 crc kubenswrapper[4777]: I0216 21:42:49.641926 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 16 21:42:49 crc kubenswrapper[4777]: I0216 21:42:49.811155 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 16 21:42:50 crc kubenswrapper[4777]: I0216 21:42:50.119182 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 16 21:42:50 crc kubenswrapper[4777]: I0216 21:42:50.225236 4777 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Feb 16 21:42:50 crc kubenswrapper[4777]: I0216 21:42:50.259515 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 16 21:42:50 crc kubenswrapper[4777]: I0216 21:42:50.343068 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 16 21:42:50 crc kubenswrapper[4777]: I0216 21:42:50.463924 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 16 21:42:50 crc kubenswrapper[4777]: I0216 21:42:50.719966 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 16 21:42:51 crc kubenswrapper[4777]: I0216 21:42:51.132367 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 16 21:42:51 crc kubenswrapper[4777]: I0216 21:42:51.194384 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 16 21:42:51 crc kubenswrapper[4777]: I0216 21:42:51.197828 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 16 21:42:51 crc kubenswrapper[4777]: I0216 21:42:51.258219 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 16 21:42:52 crc kubenswrapper[4777]: I0216 21:42:52.771672 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.051040 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 16 21:42:53 crc 
kubenswrapper[4777]: I0216 21:42:53.051508 4777 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f3e389f78403ebe8b0abb95cf04041926a80fc1e389a87164f235f7e510a758d" exitCode=137 Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.051583 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a89b3368c7952919f8870ef3a6cd83962f7ceda12509f57b6082f27486e5fb92" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.091115 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.091263 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.259552 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.259650 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.259755 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.259761 4777 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.259794 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.259801 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.259859 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.259874 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.259841 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.260782 4777 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.260829 4777 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.260849 4777 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.260871 4777 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.271883 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.362289 4777 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.825344 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 16 21:42:53 crc kubenswrapper[4777]: I0216 21:42:53.825460 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 16 21:42:54 crc kubenswrapper[4777]: I0216 21:42:54.059319 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 16 21:42:54 crc kubenswrapper[4777]: I0216 21:42:54.193059 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 16 21:42:54 crc kubenswrapper[4777]: I0216 21:42:54.193562 4777 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 16 21:42:54 crc kubenswrapper[4777]: I0216 21:42:54.208695 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 16 21:42:54 crc kubenswrapper[4777]: I0216 21:42:54.208783 4777 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1681d77f-7e42-42d9-8f59-eaf3a5c993da" Feb 16 21:42:54 crc kubenswrapper[4777]: I0216 21:42:54.216803 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 16 21:42:54 crc kubenswrapper[4777]: 
I0216 21:42:54.216883 4777 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1681d77f-7e42-42d9-8f59-eaf3a5c993da" Feb 16 21:42:59 crc kubenswrapper[4777]: I0216 21:42:59.958184 4777 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 16 21:43:09 crc kubenswrapper[4777]: I0216 21:43:09.183663 4777 generic.go:334] "Generic (PLEG): container finished" podID="acf589f2-0447-4431-92cc-73956a345b44" containerID="d04996da245f1a942dff21e11727be6dd4105be864af59a55c7bc6ef30077764" exitCode=0 Feb 16 21:43:09 crc kubenswrapper[4777]: I0216 21:43:09.183794 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" event={"ID":"acf589f2-0447-4431-92cc-73956a345b44","Type":"ContainerDied","Data":"d04996da245f1a942dff21e11727be6dd4105be864af59a55c7bc6ef30077764"} Feb 16 21:43:09 crc kubenswrapper[4777]: I0216 21:43:09.185301 4777 scope.go:117] "RemoveContainer" containerID="d04996da245f1a942dff21e11727be6dd4105be864af59a55c7bc6ef30077764" Feb 16 21:43:10 crc kubenswrapper[4777]: I0216 21:43:10.198376 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" event={"ID":"acf589f2-0447-4431-92cc-73956a345b44","Type":"ContainerStarted","Data":"261fac8202ca645dbfcd699349752872862546da3c33e57b7676ee1e201250d3"} Feb 16 21:43:10 crc kubenswrapper[4777]: I0216 21:43:10.200130 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:43:10 crc kubenswrapper[4777]: I0216 21:43:10.202481 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:43:17 crc kubenswrapper[4777]: I0216 21:43:17.251915 4777 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 16 21:43:17 crc kubenswrapper[4777]: I0216 21:43:17.257485 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 16 21:43:17 crc kubenswrapper[4777]: I0216 21:43:17.257574 4777 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="eddc5314f32b82f4be0c45c15f240220162b0d5b5bf593cbf8c7f7b42a2b8616" exitCode=137 Feb 16 21:43:17 crc kubenswrapper[4777]: I0216 21:43:17.257627 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"eddc5314f32b82f4be0c45c15f240220162b0d5b5bf593cbf8c7f7b42a2b8616"} Feb 16 21:43:17 crc kubenswrapper[4777]: I0216 21:43:17.257674 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4b2c64f54cde3614eaf7ab66724d09aa0b79b530726d72d458c98bfe24d19e36"} Feb 16 21:43:17 crc kubenswrapper[4777]: I0216 21:43:17.257709 4777 scope.go:117] "RemoveContainer" containerID="ccb8ba0d8458cc8d1ca40242b9d4c8c45e83ea7d92e8b99b4b0d70172614a0ce" Feb 16 21:43:18 crc kubenswrapper[4777]: I0216 21:43:18.268335 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 16 21:43:18 crc kubenswrapper[4777]: I0216 21:43:18.865109 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:43:26 crc kubenswrapper[4777]: 
I0216 21:43:26.786165 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:43:26 crc kubenswrapper[4777]: I0216 21:43:26.796209 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:43:27 crc kubenswrapper[4777]: I0216 21:43:27.345357 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 16 21:43:41 crc kubenswrapper[4777]: I0216 21:43:41.651956 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:43:41 crc kubenswrapper[4777]: I0216 21:43:41.652664 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:43:42 crc kubenswrapper[4777]: I0216 21:43:42.455185 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sx74s"] Feb 16 21:43:42 crc kubenswrapper[4777]: I0216 21:43:42.455474 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" podUID="ca8070e1-ca1b-4716-9e12-c72f37e21f95" containerName="controller-manager" containerID="cri-o://185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d" gracePeriod=30 Feb 16 21:43:42 crc kubenswrapper[4777]: I0216 21:43:42.543577 4777 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs"] Feb 16 21:43:42 crc kubenswrapper[4777]: I0216 21:43:42.543910 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" podUID="63641d03-7d3e-4b50-a5da-722b2b5b97c2" containerName="route-controller-manager" containerID="cri-o://6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e" gracePeriod=30 Feb 16 21:43:42 crc kubenswrapper[4777]: I0216 21:43:42.859637 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:43:42 crc kubenswrapper[4777]: I0216 21:43:42.910765 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.039011 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5bjg\" (UniqueName: \"kubernetes.io/projected/ca8070e1-ca1b-4716-9e12-c72f37e21f95-kube-api-access-s5bjg\") pod \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.039086 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-client-ca\") pod \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.039119 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63641d03-7d3e-4b50-a5da-722b2b5b97c2-serving-cert\") pod \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\" (UID: 
\"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.039152 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlnv4\" (UniqueName: \"kubernetes.io/projected/63641d03-7d3e-4b50-a5da-722b2b5b97c2-kube-api-access-jlnv4\") pod \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.039187 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-client-ca\") pod \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.039220 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-proxy-ca-bundles\") pod \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.039268 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-config\") pod \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.039341 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8070e1-ca1b-4716-9e12-c72f37e21f95-serving-cert\") pod \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\" (UID: \"ca8070e1-ca1b-4716-9e12-c72f37e21f95\") " Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.039379 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-config\") pod \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\" (UID: \"63641d03-7d3e-4b50-a5da-722b2b5b97c2\") " Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.040857 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-config" (OuterVolumeSpecName: "config") pod "63641d03-7d3e-4b50-a5da-722b2b5b97c2" (UID: "63641d03-7d3e-4b50-a5da-722b2b5b97c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.042762 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-client-ca" (OuterVolumeSpecName: "client-ca") pod "63641d03-7d3e-4b50-a5da-722b2b5b97c2" (UID: "63641d03-7d3e-4b50-a5da-722b2b5b97c2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.042976 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-client-ca" (OuterVolumeSpecName: "client-ca") pod "ca8070e1-ca1b-4716-9e12-c72f37e21f95" (UID: "ca8070e1-ca1b-4716-9e12-c72f37e21f95"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.043298 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-config" (OuterVolumeSpecName: "config") pod "ca8070e1-ca1b-4716-9e12-c72f37e21f95" (UID: "ca8070e1-ca1b-4716-9e12-c72f37e21f95"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.043312 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ca8070e1-ca1b-4716-9e12-c72f37e21f95" (UID: "ca8070e1-ca1b-4716-9e12-c72f37e21f95"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.048792 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca8070e1-ca1b-4716-9e12-c72f37e21f95-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ca8070e1-ca1b-4716-9e12-c72f37e21f95" (UID: "ca8070e1-ca1b-4716-9e12-c72f37e21f95"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.049000 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63641d03-7d3e-4b50-a5da-722b2b5b97c2-kube-api-access-jlnv4" (OuterVolumeSpecName: "kube-api-access-jlnv4") pod "63641d03-7d3e-4b50-a5da-722b2b5b97c2" (UID: "63641d03-7d3e-4b50-a5da-722b2b5b97c2"). InnerVolumeSpecName "kube-api-access-jlnv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.049042 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63641d03-7d3e-4b50-a5da-722b2b5b97c2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "63641d03-7d3e-4b50-a5da-722b2b5b97c2" (UID: "63641d03-7d3e-4b50-a5da-722b2b5b97c2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.049945 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca8070e1-ca1b-4716-9e12-c72f37e21f95-kube-api-access-s5bjg" (OuterVolumeSpecName: "kube-api-access-s5bjg") pod "ca8070e1-ca1b-4716-9e12-c72f37e21f95" (UID: "ca8070e1-ca1b-4716-9e12-c72f37e21f95"). InnerVolumeSpecName "kube-api-access-s5bjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.141222 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.141246 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8070e1-ca1b-4716-9e12-c72f37e21f95-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.141256 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.141264 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5bjg\" (UniqueName: \"kubernetes.io/projected/ca8070e1-ca1b-4716-9e12-c72f37e21f95-kube-api-access-s5bjg\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.141275 4777 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.141283 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/63641d03-7d3e-4b50-a5da-722b2b5b97c2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.141291 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlnv4\" (UniqueName: \"kubernetes.io/projected/63641d03-7d3e-4b50-a5da-722b2b5b97c2-kube-api-access-jlnv4\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.141299 4777 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63641d03-7d3e-4b50-a5da-722b2b5b97c2-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.141307 4777 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca8070e1-ca1b-4716-9e12-c72f37e21f95-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.558856 4777 generic.go:334] "Generic (PLEG): container finished" podID="63641d03-7d3e-4b50-a5da-722b2b5b97c2" containerID="6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e" exitCode=0 Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.558970 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.558967 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" event={"ID":"63641d03-7d3e-4b50-a5da-722b2b5b97c2","Type":"ContainerDied","Data":"6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e"} Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.559357 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs" event={"ID":"63641d03-7d3e-4b50-a5da-722b2b5b97c2","Type":"ContainerDied","Data":"da4a622d10d6d690486a934f799a9bfcc310f7bfd9ec9cd4ed75f38b66ac87a3"} Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.559398 4777 scope.go:117] "RemoveContainer" containerID="6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.564974 4777 generic.go:334] "Generic (PLEG): container finished" podID="ca8070e1-ca1b-4716-9e12-c72f37e21f95" containerID="185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d" exitCode=0 Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.565027 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" event={"ID":"ca8070e1-ca1b-4716-9e12-c72f37e21f95","Type":"ContainerDied","Data":"185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d"} Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.565070 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" event={"ID":"ca8070e1-ca1b-4716-9e12-c72f37e21f95","Type":"ContainerDied","Data":"19f0d9b6e80eb6c01d1e140f35a5f9153c1bc721e45b9659d51cb6076511f5ee"} Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.565120 4777 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sx74s" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.603041 4777 scope.go:117] "RemoveContainer" containerID="6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e" Feb 16 21:43:43 crc kubenswrapper[4777]: E0216 21:43:43.603840 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e\": container with ID starting with 6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e not found: ID does not exist" containerID="6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.604247 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e"} err="failed to get container status \"6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e\": rpc error: code = NotFound desc = could not find container \"6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e\": container with ID starting with 6ff55fe9225a9462c22f452e7831c5158d378c3e490b03963a8cae0eadba3c2e not found: ID does not exist" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.604310 4777 scope.go:117] "RemoveContainer" containerID="185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.617890 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sx74s"] Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.622091 4777 scope.go:117] "RemoveContainer" containerID="185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d" Feb 16 21:43:43 crc kubenswrapper[4777]: E0216 21:43:43.625741 4777 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d\": container with ID starting with 185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d not found: ID does not exist" containerID="185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.625810 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d"} err="failed to get container status \"185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d\": rpc error: code = NotFound desc = could not find container \"185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d\": container with ID starting with 185913672a0c1de146e19574deb03260bcdf6f5a9659d6e5c8c33aad7f1a652d not found: ID does not exist" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.627474 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sx74s"] Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.634002 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs"] Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.643349 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7kqgs"] Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.706263 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-tgvrn"] Feb 16 21:43:43 crc kubenswrapper[4777]: E0216 21:43:43.706680 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 16 21:43:43 crc 
kubenswrapper[4777]: I0216 21:43:43.706695 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 16 21:43:43 crc kubenswrapper[4777]: E0216 21:43:43.706707 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63641d03-7d3e-4b50-a5da-722b2b5b97c2" containerName="route-controller-manager" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.706728 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="63641d03-7d3e-4b50-a5da-722b2b5b97c2" containerName="route-controller-manager" Feb 16 21:43:43 crc kubenswrapper[4777]: E0216 21:43:43.706741 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca8070e1-ca1b-4716-9e12-c72f37e21f95" containerName="controller-manager" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.706748 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8070e1-ca1b-4716-9e12-c72f37e21f95" containerName="controller-manager" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.706859 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.706869 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca8070e1-ca1b-4716-9e12-c72f37e21f95" containerName="controller-manager" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.706880 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="63641d03-7d3e-4b50-a5da-722b2b5b97c2" containerName="route-controller-manager" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.707749 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.710208 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.710404 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.710658 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.710738 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.710913 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.712588 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.716338 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n"] Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.717118 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.719847 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.720188 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.720498 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.720619 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.720806 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.720977 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.721963 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-tgvrn"] Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.723590 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.728094 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n"] Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.755740 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-client-ca\") pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.755948 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34644087-e705-4790-83d2-b73fde1b73d3-serving-cert\") pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.756073 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-proxy-ca-bundles\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.756291 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs7h6\" (UniqueName: \"kubernetes.io/projected/34644087-e705-4790-83d2-b73fde1b73d3-kube-api-access-bs7h6\") pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.756424 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-config\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: 
\"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.756532 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-client-ca\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.756648 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-serving-cert\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.756777 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kdcx\" (UniqueName: \"kubernetes.io/projected/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-kube-api-access-7kdcx\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.756907 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-config\") pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.857579 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7kdcx\" (UniqueName: \"kubernetes.io/projected/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-kube-api-access-7kdcx\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.857653 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-config\") pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.857690 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-client-ca\") pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.857744 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34644087-e705-4790-83d2-b73fde1b73d3-serving-cert\") pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.857769 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-proxy-ca-bundles\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " 
pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.857788 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs7h6\" (UniqueName: \"kubernetes.io/projected/34644087-e705-4790-83d2-b73fde1b73d3-kube-api-access-bs7h6\") pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.857832 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-config\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.857855 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-client-ca\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.857900 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-serving-cert\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.859686 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-client-ca\") 
pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.860104 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-config\") pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.862001 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-client-ca\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.863121 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-proxy-ca-bundles\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.864401 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-config\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.865265 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/34644087-e705-4790-83d2-b73fde1b73d3-serving-cert\") pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.865454 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-serving-cert\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.882300 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kdcx\" (UniqueName: \"kubernetes.io/projected/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-kube-api-access-7kdcx\") pod \"controller-manager-77dd5df966-tgvrn\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:43 crc kubenswrapper[4777]: I0216 21:43:43.885971 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs7h6\" (UniqueName: \"kubernetes.io/projected/34644087-e705-4790-83d2-b73fde1b73d3-kube-api-access-bs7h6\") pod \"route-controller-manager-7b9cdbf977-94z6n\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.065596 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.076451 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.196417 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63641d03-7d3e-4b50-a5da-722b2b5b97c2" path="/var/lib/kubelet/pods/63641d03-7d3e-4b50-a5da-722b2b5b97c2/volumes" Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.198040 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca8070e1-ca1b-4716-9e12-c72f37e21f95" path="/var/lib/kubelet/pods/ca8070e1-ca1b-4716-9e12-c72f37e21f95/volumes" Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.320321 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-tgvrn"] Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.352295 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n"] Feb 16 21:43:44 crc kubenswrapper[4777]: W0216 21:43:44.359397 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34644087_e705_4790_83d2_b73fde1b73d3.slice/crio-67d56a1efe66c931420c0b221176a4b83a22050d683505ff4e2d652af5dac6b9 WatchSource:0}: Error finding container 67d56a1efe66c931420c0b221176a4b83a22050d683505ff4e2d652af5dac6b9: Status 404 returned error can't find the container with id 67d56a1efe66c931420c0b221176a4b83a22050d683505ff4e2d652af5dac6b9 Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.577558 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" event={"ID":"34644087-e705-4790-83d2-b73fde1b73d3","Type":"ContainerStarted","Data":"78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50"} Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.577976 4777 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" event={"ID":"34644087-e705-4790-83d2-b73fde1b73d3","Type":"ContainerStarted","Data":"67d56a1efe66c931420c0b221176a4b83a22050d683505ff4e2d652af5dac6b9"} Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.578461 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.579995 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" event={"ID":"00f0f8a8-eed2-48e8-a5f4-c77988c7d685","Type":"ContainerStarted","Data":"c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53"} Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.580035 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" event={"ID":"00f0f8a8-eed2-48e8-a5f4-c77988c7d685","Type":"ContainerStarted","Data":"a72fb6ba161a80c9575bb59637dd62e9b5731f9b32136f68ca698a8a237f5ede"} Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.580605 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.587285 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.594465 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" podStartSLOduration=2.594448769 podStartE2EDuration="2.594448769s" podCreationTimestamp="2026-02-16 21:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-16 21:43:44.594081502 +0000 UTC m=+345.176582614" watchObservedRunningTime="2026-02-16 21:43:44.594448769 +0000 UTC m=+345.176949861" Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.613397 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" podStartSLOduration=2.613372696 podStartE2EDuration="2.613372696s" podCreationTimestamp="2026-02-16 21:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:43:44.612316635 +0000 UTC m=+345.194817747" watchObservedRunningTime="2026-02-16 21:43:44.613372696 +0000 UTC m=+345.195873818" Feb 16 21:43:44 crc kubenswrapper[4777]: I0216 21:43:44.830157 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:51 crc kubenswrapper[4777]: I0216 21:43:51.807321 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-tgvrn"] Feb 16 21:43:51 crc kubenswrapper[4777]: I0216 21:43:51.808142 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" podUID="00f0f8a8-eed2-48e8-a5f4-c77988c7d685" containerName="controller-manager" containerID="cri-o://c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53" gracePeriod=30 Feb 16 21:43:51 crc kubenswrapper[4777]: I0216 21:43:51.816881 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n"] Feb 16 21:43:51 crc kubenswrapper[4777]: I0216 21:43:51.817170 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" 
podUID="34644087-e705-4790-83d2-b73fde1b73d3" containerName="route-controller-manager" containerID="cri-o://78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50" gracePeriod=30 Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.305425 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.378785 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-client-ca\") pod \"34644087-e705-4790-83d2-b73fde1b73d3\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.378934 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34644087-e705-4790-83d2-b73fde1b73d3-serving-cert\") pod \"34644087-e705-4790-83d2-b73fde1b73d3\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.378997 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs7h6\" (UniqueName: \"kubernetes.io/projected/34644087-e705-4790-83d2-b73fde1b73d3-kube-api-access-bs7h6\") pod \"34644087-e705-4790-83d2-b73fde1b73d3\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.379111 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-config\") pod \"34644087-e705-4790-83d2-b73fde1b73d3\" (UID: \"34644087-e705-4790-83d2-b73fde1b73d3\") " Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.380106 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-client-ca" (OuterVolumeSpecName: "client-ca") pod "34644087-e705-4790-83d2-b73fde1b73d3" (UID: "34644087-e705-4790-83d2-b73fde1b73d3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.380137 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-config" (OuterVolumeSpecName: "config") pod "34644087-e705-4790-83d2-b73fde1b73d3" (UID: "34644087-e705-4790-83d2-b73fde1b73d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.387787 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34644087-e705-4790-83d2-b73fde1b73d3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "34644087-e705-4790-83d2-b73fde1b73d3" (UID: "34644087-e705-4790-83d2-b73fde1b73d3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.388655 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34644087-e705-4790-83d2-b73fde1b73d3-kube-api-access-bs7h6" (OuterVolumeSpecName: "kube-api-access-bs7h6") pod "34644087-e705-4790-83d2-b73fde1b73d3" (UID: "34644087-e705-4790-83d2-b73fde1b73d3"). InnerVolumeSpecName "kube-api-access-bs7h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.435603 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.480553 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-client-ca\") pod \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.480605 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-config\") pod \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.480673 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-serving-cert\") pod \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.480697 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-proxy-ca-bundles\") pod \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.480737 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kdcx\" (UniqueName: \"kubernetes.io/projected/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-kube-api-access-7kdcx\") pod \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\" (UID: \"00f0f8a8-eed2-48e8-a5f4-c77988c7d685\") " Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.481010 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/34644087-e705-4790-83d2-b73fde1b73d3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.481023 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs7h6\" (UniqueName: \"kubernetes.io/projected/34644087-e705-4790-83d2-b73fde1b73d3-kube-api-access-bs7h6\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.481035 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.481044 4777 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34644087-e705-4790-83d2-b73fde1b73d3-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.481865 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "00f0f8a8-eed2-48e8-a5f4-c77988c7d685" (UID: "00f0f8a8-eed2-48e8-a5f4-c77988c7d685"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.482170 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-config" (OuterVolumeSpecName: "config") pod "00f0f8a8-eed2-48e8-a5f4-c77988c7d685" (UID: "00f0f8a8-eed2-48e8-a5f4-c77988c7d685"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.482186 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-client-ca" (OuterVolumeSpecName: "client-ca") pod "00f0f8a8-eed2-48e8-a5f4-c77988c7d685" (UID: "00f0f8a8-eed2-48e8-a5f4-c77988c7d685"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.485838 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "00f0f8a8-eed2-48e8-a5f4-c77988c7d685" (UID: "00f0f8a8-eed2-48e8-a5f4-c77988c7d685"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.485859 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-kube-api-access-7kdcx" (OuterVolumeSpecName: "kube-api-access-7kdcx") pod "00f0f8a8-eed2-48e8-a5f4-c77988c7d685" (UID: "00f0f8a8-eed2-48e8-a5f4-c77988c7d685"). InnerVolumeSpecName "kube-api-access-7kdcx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.581635 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.581670 4777 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.581684 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kdcx\" (UniqueName: \"kubernetes.io/projected/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-kube-api-access-7kdcx\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.581694 4777 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.581704 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f0f8a8-eed2-48e8-a5f4-c77988c7d685-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.646970 4777 generic.go:334] "Generic (PLEG): container finished" podID="34644087-e705-4790-83d2-b73fde1b73d3" containerID="78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50" exitCode=0 Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.647069 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" event={"ID":"34644087-e705-4790-83d2-b73fde1b73d3","Type":"ContainerDied","Data":"78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50"} Feb 16 
21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.647165 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" event={"ID":"34644087-e705-4790-83d2-b73fde1b73d3","Type":"ContainerDied","Data":"67d56a1efe66c931420c0b221176a4b83a22050d683505ff4e2d652af5dac6b9"} Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.647087 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.647213 4777 scope.go:117] "RemoveContainer" containerID="78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.652349 4777 generic.go:334] "Generic (PLEG): container finished" podID="00f0f8a8-eed2-48e8-a5f4-c77988c7d685" containerID="c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53" exitCode=0 Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.652394 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" event={"ID":"00f0f8a8-eed2-48e8-a5f4-c77988c7d685","Type":"ContainerDied","Data":"c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53"} Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.652424 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" event={"ID":"00f0f8a8-eed2-48e8-a5f4-c77988c7d685","Type":"ContainerDied","Data":"a72fb6ba161a80c9575bb59637dd62e9b5731f9b32136f68ca698a8a237f5ede"} Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.652485 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77dd5df966-tgvrn" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.672413 4777 scope.go:117] "RemoveContainer" containerID="78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50" Feb 16 21:43:52 crc kubenswrapper[4777]: E0216 21:43:52.673335 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50\": container with ID starting with 78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50 not found: ID does not exist" containerID="78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.673507 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50"} err="failed to get container status \"78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50\": rpc error: code = NotFound desc = could not find container \"78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50\": container with ID starting with 78589ddeea3144df0c6feb2959bcf9ff9ef4332d47df699428d2d18d300c1b50 not found: ID does not exist" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.673656 4777 scope.go:117] "RemoveContainer" containerID="c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.682393 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n"] Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.694513 4777 scope.go:117] "RemoveContainer" containerID="c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53" Feb 16 21:43:52 crc kubenswrapper[4777]: E0216 21:43:52.695146 4777 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53\": container with ID starting with c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53 not found: ID does not exist" containerID="c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.695204 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53"} err="failed to get container status \"c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53\": rpc error: code = NotFound desc = could not find container \"c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53\": container with ID starting with c72d27e5cae4a0d1b8fdc6f0d4a3f70b295b59cffd3aeda687ead61016371e53 not found: ID does not exist" Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.706189 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9cdbf977-94z6n"] Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.714077 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-tgvrn"] Feb 16 21:43:52 crc kubenswrapper[4777]: I0216 21:43:52.720945 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77dd5df966-tgvrn"] Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.723343 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq"] Feb 16 21:43:53 crc kubenswrapper[4777]: E0216 21:43:53.724440 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34644087-e705-4790-83d2-b73fde1b73d3" containerName="route-controller-manager" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 
21:43:53.724528 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="34644087-e705-4790-83d2-b73fde1b73d3" containerName="route-controller-manager" Feb 16 21:43:53 crc kubenswrapper[4777]: E0216 21:43:53.725604 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f0f8a8-eed2-48e8-a5f4-c77988c7d685" containerName="controller-manager" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.725632 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f0f8a8-eed2-48e8-a5f4-c77988c7d685" containerName="controller-manager" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.726242 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="34644087-e705-4790-83d2-b73fde1b73d3" containerName="route-controller-manager" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.726316 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="00f0f8a8-eed2-48e8-a5f4-c77988c7d685" containerName="controller-manager" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.727544 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.730031 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm"] Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.735255 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.735278 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.736783 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.736891 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.737189 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.737375 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.737876 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.737943 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.738418 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.738513 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.739167 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 
21:43:53.739376 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.745662 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.748280 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq"] Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.750325 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.755466 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm"] Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.801378 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-proxy-ca-bundles\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.801474 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6nhx\" (UniqueName: \"kubernetes.io/projected/7d934eea-9ad7-4cf6-ae36-6437958a17f7-kube-api-access-s6nhx\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.801526 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-4gld5\" (UniqueName: \"kubernetes.io/projected/bf26cbc8-49fb-4be1-be65-ea326d410411-kube-api-access-4gld5\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.801568 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-config\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.801605 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d934eea-9ad7-4cf6-ae36-6437958a17f7-serving-cert\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.801640 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-config\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.801690 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf26cbc8-49fb-4be1-be65-ea326d410411-serving-cert\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " 
pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.801772 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-client-ca\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.801995 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-client-ca\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.903358 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf26cbc8-49fb-4be1-be65-ea326d410411-serving-cert\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.903485 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-client-ca\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.903576 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-client-ca\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.903686 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-proxy-ca-bundles\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.903775 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6nhx\" (UniqueName: \"kubernetes.io/projected/7d934eea-9ad7-4cf6-ae36-6437958a17f7-kube-api-access-s6nhx\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.903850 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gld5\" (UniqueName: \"kubernetes.io/projected/bf26cbc8-49fb-4be1-be65-ea326d410411-kube-api-access-4gld5\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.903902 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-config\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 
crc kubenswrapper[4777]: I0216 21:43:53.903959 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d934eea-9ad7-4cf6-ae36-6437958a17f7-serving-cert\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.904008 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-config\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.904878 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-client-ca\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.905306 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-client-ca\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.905470 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-proxy-ca-bundles\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: 
\"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.906541 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-config\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.907829 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-config\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.909892 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d934eea-9ad7-4cf6-ae36-6437958a17f7-serving-cert\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.910568 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf26cbc8-49fb-4be1-be65-ea326d410411-serving-cert\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.930679 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6nhx\" (UniqueName: 
\"kubernetes.io/projected/7d934eea-9ad7-4cf6-ae36-6437958a17f7-kube-api-access-s6nhx\") pod \"route-controller-manager-64886dcbf7-5v5vq\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:53 crc kubenswrapper[4777]: I0216 21:43:53.935380 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gld5\" (UniqueName: \"kubernetes.io/projected/bf26cbc8-49fb-4be1-be65-ea326d410411-kube-api-access-4gld5\") pod \"controller-manager-5877bb7bbb-c4mzm\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.067361 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.079153 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.199200 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00f0f8a8-eed2-48e8-a5f4-c77988c7d685" path="/var/lib/kubelet/pods/00f0f8a8-eed2-48e8-a5f4-c77988c7d685/volumes" Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.203788 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34644087-e705-4790-83d2-b73fde1b73d3" path="/var/lib/kubelet/pods/34644087-e705-4790-83d2-b73fde1b73d3/volumes" Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.389272 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm"] Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.523478 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq"] Feb 16 21:43:54 crc kubenswrapper[4777]: W0216 21:43:54.530079 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d934eea_9ad7_4cf6_ae36_6437958a17f7.slice/crio-432e11546f350dbaa7aaa6e92c56e3fefef7ff1fb38f0a8aa9903d32306e4030 WatchSource:0}: Error finding container 432e11546f350dbaa7aaa6e92c56e3fefef7ff1fb38f0a8aa9903d32306e4030: Status 404 returned error can't find the container with id 432e11546f350dbaa7aaa6e92c56e3fefef7ff1fb38f0a8aa9903d32306e4030 Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.671738 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" event={"ID":"bf26cbc8-49fb-4be1-be65-ea326d410411","Type":"ContainerStarted","Data":"becc70a76086862cf450e6a8b4799bac0b2d43561c7afcaa32ad841b9dfc33b5"} Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.671982 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" event={"ID":"bf26cbc8-49fb-4be1-be65-ea326d410411","Type":"ContainerStarted","Data":"3db64eaa72c4d22386ae015e7fd4895fd429653a8e8bab6888cecdbcb360fa3e"} Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.672205 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.675079 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" event={"ID":"7d934eea-9ad7-4cf6-ae36-6437958a17f7","Type":"ContainerStarted","Data":"71e22be53615a8132f4ef99427187cc501b7aef071b14691424859d20099d41f"} Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.675133 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" event={"ID":"7d934eea-9ad7-4cf6-ae36-6437958a17f7","Type":"ContainerStarted","Data":"432e11546f350dbaa7aaa6e92c56e3fefef7ff1fb38f0a8aa9903d32306e4030"} Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.675863 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.677177 4777 patch_prober.go:28] interesting pod/route-controller-manager-64886dcbf7-5v5vq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.677229 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" podUID="7d934eea-9ad7-4cf6-ae36-6437958a17f7" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.682421 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.698641 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" podStartSLOduration=3.698615769 podStartE2EDuration="3.698615769s" podCreationTimestamp="2026-02-16 21:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:43:54.692539688 +0000 UTC m=+355.275040800" watchObservedRunningTime="2026-02-16 21:43:54.698615769 +0000 UTC m=+355.281116891" Feb 16 21:43:54 crc kubenswrapper[4777]: I0216 21:43:54.750506 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" podStartSLOduration=3.750481652 podStartE2EDuration="3.750481652s" podCreationTimestamp="2026-02-16 21:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:43:54.749024313 +0000 UTC m=+355.331525415" watchObservedRunningTime="2026-02-16 21:43:54.750481652 +0000 UTC m=+355.332982754" Feb 16 21:43:55 crc kubenswrapper[4777]: I0216 21:43:55.688551 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:44:04 crc kubenswrapper[4777]: I0216 21:44:04.459079 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm"] Feb 16 
21:44:04 crc kubenswrapper[4777]: I0216 21:44:04.459942 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" podUID="bf26cbc8-49fb-4be1-be65-ea326d410411" containerName="controller-manager" containerID="cri-o://becc70a76086862cf450e6a8b4799bac0b2d43561c7afcaa32ad841b9dfc33b5" gracePeriod=30 Feb 16 21:44:04 crc kubenswrapper[4777]: I0216 21:44:04.482616 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq"] Feb 16 21:44:04 crc kubenswrapper[4777]: I0216 21:44:04.483381 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" podUID="7d934eea-9ad7-4cf6-ae36-6437958a17f7" containerName="route-controller-manager" containerID="cri-o://71e22be53615a8132f4ef99427187cc501b7aef071b14691424859d20099d41f" gracePeriod=30 Feb 16 21:44:04 crc kubenswrapper[4777]: I0216 21:44:04.746533 4777 generic.go:334] "Generic (PLEG): container finished" podID="bf26cbc8-49fb-4be1-be65-ea326d410411" containerID="becc70a76086862cf450e6a8b4799bac0b2d43561c7afcaa32ad841b9dfc33b5" exitCode=0 Feb 16 21:44:04 crc kubenswrapper[4777]: I0216 21:44:04.746615 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" event={"ID":"bf26cbc8-49fb-4be1-be65-ea326d410411","Type":"ContainerDied","Data":"becc70a76086862cf450e6a8b4799bac0b2d43561c7afcaa32ad841b9dfc33b5"} Feb 16 21:44:04 crc kubenswrapper[4777]: I0216 21:44:04.748445 4777 generic.go:334] "Generic (PLEG): container finished" podID="7d934eea-9ad7-4cf6-ae36-6437958a17f7" containerID="71e22be53615a8132f4ef99427187cc501b7aef071b14691424859d20099d41f" exitCode=0 Feb 16 21:44:04 crc kubenswrapper[4777]: I0216 21:44:04.748467 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" event={"ID":"7d934eea-9ad7-4cf6-ae36-6437958a17f7","Type":"ContainerDied","Data":"71e22be53615a8132f4ef99427187cc501b7aef071b14691424859d20099d41f"} Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.038643 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.048442 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.073382 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-config\") pod \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.073461 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d934eea-9ad7-4cf6-ae36-6437958a17f7-serving-cert\") pod \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.073491 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-client-ca\") pod \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.073573 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6nhx\" (UniqueName: \"kubernetes.io/projected/7d934eea-9ad7-4cf6-ae36-6437958a17f7-kube-api-access-s6nhx\") pod 
\"7d934eea-9ad7-4cf6-ae36-6437958a17f7\" (UID: \"7d934eea-9ad7-4cf6-ae36-6437958a17f7\") " Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.074666 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-client-ca" (OuterVolumeSpecName: "client-ca") pod "7d934eea-9ad7-4cf6-ae36-6437958a17f7" (UID: "7d934eea-9ad7-4cf6-ae36-6437958a17f7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.074775 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-config" (OuterVolumeSpecName: "config") pod "7d934eea-9ad7-4cf6-ae36-6437958a17f7" (UID: "7d934eea-9ad7-4cf6-ae36-6437958a17f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.079838 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d934eea-9ad7-4cf6-ae36-6437958a17f7-kube-api-access-s6nhx" (OuterVolumeSpecName: "kube-api-access-s6nhx") pod "7d934eea-9ad7-4cf6-ae36-6437958a17f7" (UID: "7d934eea-9ad7-4cf6-ae36-6437958a17f7"). InnerVolumeSpecName "kube-api-access-s6nhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.079855 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d934eea-9ad7-4cf6-ae36-6437958a17f7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7d934eea-9ad7-4cf6-ae36-6437958a17f7" (UID: "7d934eea-9ad7-4cf6-ae36-6437958a17f7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.175261 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gld5\" (UniqueName: \"kubernetes.io/projected/bf26cbc8-49fb-4be1-be65-ea326d410411-kube-api-access-4gld5\") pod \"bf26cbc8-49fb-4be1-be65-ea326d410411\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.175733 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf26cbc8-49fb-4be1-be65-ea326d410411-serving-cert\") pod \"bf26cbc8-49fb-4be1-be65-ea326d410411\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.175844 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-config\") pod \"bf26cbc8-49fb-4be1-be65-ea326d410411\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.175947 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-client-ca\") pod \"bf26cbc8-49fb-4be1-be65-ea326d410411\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.176554 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-proxy-ca-bundles\") pod \"bf26cbc8-49fb-4be1-be65-ea326d410411\" (UID: \"bf26cbc8-49fb-4be1-be65-ea326d410411\") " Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.176928 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7d934eea-9ad7-4cf6-ae36-6437958a17f7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.177021 4777 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.177091 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6nhx\" (UniqueName: \"kubernetes.io/projected/7d934eea-9ad7-4cf6-ae36-6437958a17f7-kube-api-access-s6nhx\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.177157 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d934eea-9ad7-4cf6-ae36-6437958a17f7-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.176588 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-client-ca" (OuterVolumeSpecName: "client-ca") pod "bf26cbc8-49fb-4be1-be65-ea326d410411" (UID: "bf26cbc8-49fb-4be1-be65-ea326d410411"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.176650 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-config" (OuterVolumeSpecName: "config") pod "bf26cbc8-49fb-4be1-be65-ea326d410411" (UID: "bf26cbc8-49fb-4be1-be65-ea326d410411"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.177025 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bf26cbc8-49fb-4be1-be65-ea326d410411" (UID: "bf26cbc8-49fb-4be1-be65-ea326d410411"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.179925 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf26cbc8-49fb-4be1-be65-ea326d410411-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bf26cbc8-49fb-4be1-be65-ea326d410411" (UID: "bf26cbc8-49fb-4be1-be65-ea326d410411"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.179932 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf26cbc8-49fb-4be1-be65-ea326d410411-kube-api-access-4gld5" (OuterVolumeSpecName: "kube-api-access-4gld5") pod "bf26cbc8-49fb-4be1-be65-ea326d410411" (UID: "bf26cbc8-49fb-4be1-be65-ea326d410411"). InnerVolumeSpecName "kube-api-access-4gld5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.278203 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gld5\" (UniqueName: \"kubernetes.io/projected/bf26cbc8-49fb-4be1-be65-ea326d410411-kube-api-access-4gld5\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.278252 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf26cbc8-49fb-4be1-be65-ea326d410411-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.278268 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.278281 4777 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.278292 4777 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf26cbc8-49fb-4be1-be65-ea326d410411-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.722623 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj"] Feb 16 21:44:05 crc kubenswrapper[4777]: E0216 21:44:05.723138 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf26cbc8-49fb-4be1-be65-ea326d410411" containerName="controller-manager" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.723163 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf26cbc8-49fb-4be1-be65-ea326d410411" containerName="controller-manager" Feb 16 21:44:05 
crc kubenswrapper[4777]: E0216 21:44:05.723189 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d934eea-9ad7-4cf6-ae36-6437958a17f7" containerName="route-controller-manager" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.723202 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d934eea-9ad7-4cf6-ae36-6437958a17f7" containerName="route-controller-manager" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.723352 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf26cbc8-49fb-4be1-be65-ea326d410411" containerName="controller-manager" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.723391 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d934eea-9ad7-4cf6-ae36-6437958a17f7" containerName="route-controller-manager" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.724268 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.728754 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-4g2ht"] Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.729912 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.737433 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj"] Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.744507 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-4g2ht"] Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.759894 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" event={"ID":"bf26cbc8-49fb-4be1-be65-ea326d410411","Type":"ContainerDied","Data":"3db64eaa72c4d22386ae015e7fd4895fd429653a8e8bab6888cecdbcb360fa3e"} Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.759970 4777 scope.go:117] "RemoveContainer" containerID="becc70a76086862cf450e6a8b4799bac0b2d43561c7afcaa32ad841b9dfc33b5" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.760156 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.764874 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" event={"ID":"7d934eea-9ad7-4cf6-ae36-6437958a17f7","Type":"ContainerDied","Data":"432e11546f350dbaa7aaa6e92c56e3fefef7ff1fb38f0a8aa9903d32306e4030"} Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.765025 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.784162 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-config\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.784255 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht2ds\" (UniqueName: \"kubernetes.io/projected/da461874-eb45-43eb-8878-1647b74378c3-kube-api-access-ht2ds\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.784303 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4f79\" (UniqueName: \"kubernetes.io/projected/598f437c-cb76-4c3a-88ef-459dbdc705b7-kube-api-access-k4f79\") pod \"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.784431 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da461874-eb45-43eb-8878-1647b74378c3-serving-cert\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.784582 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-proxy-ca-bundles\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.784667 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/598f437c-cb76-4c3a-88ef-459dbdc705b7-serving-cert\") pod \"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.784705 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-client-ca\") pod \"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.784990 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-config\") pod \"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.785080 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-client-ca\") pod 
\"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.804410 4777 scope.go:117] "RemoveContainer" containerID="71e22be53615a8132f4ef99427187cc501b7aef071b14691424859d20099d41f" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.827869 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm"] Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.832515 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5877bb7bbb-c4mzm"] Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.845634 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq"] Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.853028 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64886dcbf7-5v5vq"] Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.886524 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-config\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.886593 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht2ds\" (UniqueName: \"kubernetes.io/projected/da461874-eb45-43eb-8878-1647b74378c3-kube-api-access-ht2ds\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc 
kubenswrapper[4777]: I0216 21:44:05.886632 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4f79\" (UniqueName: \"kubernetes.io/projected/598f437c-cb76-4c3a-88ef-459dbdc705b7-kube-api-access-k4f79\") pod \"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.886653 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da461874-eb45-43eb-8878-1647b74378c3-serving-cert\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.886682 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-proxy-ca-bundles\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.886704 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/598f437c-cb76-4c3a-88ef-459dbdc705b7-serving-cert\") pod \"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.886751 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-client-ca\") pod 
\"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.886798 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-config\") pod \"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.886830 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-client-ca\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.887843 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-client-ca\") pod \"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.887914 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-client-ca\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.888125 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-config\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.888378 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-proxy-ca-bundles\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.889213 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-config\") pod \"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.892090 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/598f437c-cb76-4c3a-88ef-459dbdc705b7-serving-cert\") pod \"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.894617 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da461874-eb45-43eb-8878-1647b74378c3-serving-cert\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.906656 4777 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht2ds\" (UniqueName: \"kubernetes.io/projected/da461874-eb45-43eb-8878-1647b74378c3-kube-api-access-ht2ds\") pod \"controller-manager-6889d7b855-4g2ht\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:05 crc kubenswrapper[4777]: I0216 21:44:05.914812 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4f79\" (UniqueName: \"kubernetes.io/projected/598f437c-cb76-4c3a-88ef-459dbdc705b7-kube-api-access-k4f79\") pod \"route-controller-manager-57cffcc444-7k8wj\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.053006 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.086621 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.223196 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d934eea-9ad7-4cf6-ae36-6437958a17f7" path="/var/lib/kubelet/pods/7d934eea-9ad7-4cf6-ae36-6437958a17f7/volumes" Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.224185 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf26cbc8-49fb-4be1-be65-ea326d410411" path="/var/lib/kubelet/pods/bf26cbc8-49fb-4be1-be65-ea326d410411/volumes" Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.398204 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj"] Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.474581 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-4g2ht"] Feb 16 21:44:06 crc kubenswrapper[4777]: W0216 21:44:06.482529 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda461874_eb45_43eb_8878_1647b74378c3.slice/crio-c298a38be64ba244f77e3ee227fdf7b02ff9cd940ca7522a5a67500a6980afe5 WatchSource:0}: Error finding container c298a38be64ba244f77e3ee227fdf7b02ff9cd940ca7522a5a67500a6980afe5: Status 404 returned error can't find the container with id c298a38be64ba244f77e3ee227fdf7b02ff9cd940ca7522a5a67500a6980afe5 Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.773753 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" event={"ID":"598f437c-cb76-4c3a-88ef-459dbdc705b7","Type":"ContainerStarted","Data":"30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202"} Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.773809 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" event={"ID":"598f437c-cb76-4c3a-88ef-459dbdc705b7","Type":"ContainerStarted","Data":"21bfe901450a4b9cf2efde1f7dee903cc3f86d13b20748149ec2520d4401fbef"} Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.773839 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.775696 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" event={"ID":"da461874-eb45-43eb-8878-1647b74378c3","Type":"ContainerStarted","Data":"010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64"} Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.775795 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.775816 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" event={"ID":"da461874-eb45-43eb-8878-1647b74378c3","Type":"ContainerStarted","Data":"c298a38be64ba244f77e3ee227fdf7b02ff9cd940ca7522a5a67500a6980afe5"} Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.794639 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.804021 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" podStartSLOduration=2.804000066 podStartE2EDuration="2.804000066s" podCreationTimestamp="2026-02-16 21:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-16 21:44:06.801775352 +0000 UTC m=+367.384276474" watchObservedRunningTime="2026-02-16 21:44:06.804000066 +0000 UTC m=+367.386501168" Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.831554 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" podStartSLOduration=2.831533624 podStartE2EDuration="2.831533624s" podCreationTimestamp="2026-02-16 21:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:44:06.831399442 +0000 UTC m=+367.413900544" watchObservedRunningTime="2026-02-16 21:44:06.831533624 +0000 UTC m=+367.414034726" Feb 16 21:44:06 crc kubenswrapper[4777]: I0216 21:44:06.895958 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:11 crc kubenswrapper[4777]: I0216 21:44:11.651322 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:44:11 crc kubenswrapper[4777]: I0216 21:44:11.651807 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.433662 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj"] Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.434743 4777 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" podUID="598f437c-cb76-4c3a-88ef-459dbdc705b7" containerName="route-controller-manager" containerID="cri-o://30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202" gracePeriod=30 Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.902075 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.902201 4777 generic.go:334] "Generic (PLEG): container finished" podID="598f437c-cb76-4c3a-88ef-459dbdc705b7" containerID="30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202" exitCode=0 Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.902242 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" event={"ID":"598f437c-cb76-4c3a-88ef-459dbdc705b7","Type":"ContainerDied","Data":"30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202"} Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.902694 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" event={"ID":"598f437c-cb76-4c3a-88ef-459dbdc705b7","Type":"ContainerDied","Data":"21bfe901450a4b9cf2efde1f7dee903cc3f86d13b20748149ec2520d4401fbef"} Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.902749 4777 scope.go:117] "RemoveContainer" containerID="30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202" Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.933084 4777 scope.go:117] "RemoveContainer" containerID="30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202" Feb 16 21:44:22 crc kubenswrapper[4777]: E0216 21:44:22.933683 4777 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202\": container with ID starting with 30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202 not found: ID does not exist" containerID="30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202" Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.933742 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202"} err="failed to get container status \"30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202\": rpc error: code = NotFound desc = could not find container \"30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202\": container with ID starting with 30811a86a90385dc6c0fa3db4e28bb3e306125d3c86b187a8fcf5b73082f7202 not found: ID does not exist" Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.940329 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-config\") pod \"598f437c-cb76-4c3a-88ef-459dbdc705b7\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.940436 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4f79\" (UniqueName: \"kubernetes.io/projected/598f437c-cb76-4c3a-88ef-459dbdc705b7-kube-api-access-k4f79\") pod \"598f437c-cb76-4c3a-88ef-459dbdc705b7\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.940505 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-client-ca\") pod \"598f437c-cb76-4c3a-88ef-459dbdc705b7\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " 
Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.940534 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/598f437c-cb76-4c3a-88ef-459dbdc705b7-serving-cert\") pod \"598f437c-cb76-4c3a-88ef-459dbdc705b7\" (UID: \"598f437c-cb76-4c3a-88ef-459dbdc705b7\") " Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.941526 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-client-ca" (OuterVolumeSpecName: "client-ca") pod "598f437c-cb76-4c3a-88ef-459dbdc705b7" (UID: "598f437c-cb76-4c3a-88ef-459dbdc705b7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.941608 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-config" (OuterVolumeSpecName: "config") pod "598f437c-cb76-4c3a-88ef-459dbdc705b7" (UID: "598f437c-cb76-4c3a-88ef-459dbdc705b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.947644 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598f437c-cb76-4c3a-88ef-459dbdc705b7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "598f437c-cb76-4c3a-88ef-459dbdc705b7" (UID: "598f437c-cb76-4c3a-88ef-459dbdc705b7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:44:22 crc kubenswrapper[4777]: I0216 21:44:22.948971 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598f437c-cb76-4c3a-88ef-459dbdc705b7-kube-api-access-k4f79" (OuterVolumeSpecName: "kube-api-access-k4f79") pod "598f437c-cb76-4c3a-88ef-459dbdc705b7" (UID: "598f437c-cb76-4c3a-88ef-459dbdc705b7"). 
InnerVolumeSpecName "kube-api-access-k4f79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.042078 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4f79\" (UniqueName: \"kubernetes.io/projected/598f437c-cb76-4c3a-88ef-459dbdc705b7-kube-api-access-k4f79\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.042124 4777 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.042140 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/598f437c-cb76-4c3a-88ef-459dbdc705b7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.042157 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/598f437c-cb76-4c3a-88ef-459dbdc705b7-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.408648 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nrsdv"] Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.410687 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nrsdv" podUID="93764402-52dc-48ce-9352-bd7982f7f0d6" containerName="registry-server" containerID="cri-o://629b899e1d5dc8bcb2a60f4b27ef8e0f5a975fd73c9ce40270f2f460998b0e8c" gracePeriod=30 Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.418130 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wzxjq"] Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.418460 4777 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/community-operators-wzxjq" podUID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" containerName="registry-server" containerID="cri-o://fe789c04e2a9539ab8e6cc106f96e70ef884fca95069b6136608e6a240fc54e0" gracePeriod=30 Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.431326 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfdd6"] Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.435464 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" podUID="acf589f2-0447-4431-92cc-73956a345b44" containerName="marketplace-operator" containerID="cri-o://261fac8202ca645dbfcd699349752872862546da3c33e57b7676ee1e201250d3" gracePeriod=30 Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.439605 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k68t2"] Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.446075 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k68t2" podUID="01c46266-f08a-405d-8768-8075a15bb61d" containerName="registry-server" containerID="cri-o://b9e01d6bf7484354224c17867982d582bfa0ed5182c2f9099ce1af9f984441a6" gracePeriod=30 Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.448257 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58xj7"] Feb 16 21:44:23 crc kubenswrapper[4777]: E0216 21:44:23.448515 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598f437c-cb76-4c3a-88ef-459dbdc705b7" containerName="route-controller-manager" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.448528 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="598f437c-cb76-4c3a-88ef-459dbdc705b7" containerName="route-controller-manager" Feb 16 21:44:23 crc kubenswrapper[4777]: 
I0216 21:44:23.448628 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="598f437c-cb76-4c3a-88ef-459dbdc705b7" containerName="route-controller-manager" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.449093 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.453409 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b76h6"] Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.453748 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b76h6" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerName="registry-server" containerID="cri-o://63ba03eead54b2094492589d7645f4e4f852c1b89093cf0c4b209fd11e63cdc9" gracePeriod=30 Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.469172 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58xj7"] Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.547452 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5014a8f-fe3e-404c-a85c-6062fd0e76f7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-58xj7\" (UID: \"f5014a8f-fe3e-404c-a85c-6062fd0e76f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.547533 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5014a8f-fe3e-404c-a85c-6062fd0e76f7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-58xj7\" (UID: \"f5014a8f-fe3e-404c-a85c-6062fd0e76f7\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.547576 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nkc9\" (UniqueName: \"kubernetes.io/projected/f5014a8f-fe3e-404c-a85c-6062fd0e76f7-kube-api-access-7nkc9\") pod \"marketplace-operator-79b997595-58xj7\" (UID: \"f5014a8f-fe3e-404c-a85c-6062fd0e76f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.649645 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5014a8f-fe3e-404c-a85c-6062fd0e76f7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-58xj7\" (UID: \"f5014a8f-fe3e-404c-a85c-6062fd0e76f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.649725 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5014a8f-fe3e-404c-a85c-6062fd0e76f7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-58xj7\" (UID: \"f5014a8f-fe3e-404c-a85c-6062fd0e76f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.649761 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nkc9\" (UniqueName: \"kubernetes.io/projected/f5014a8f-fe3e-404c-a85c-6062fd0e76f7-kube-api-access-7nkc9\") pod \"marketplace-operator-79b997595-58xj7\" (UID: \"f5014a8f-fe3e-404c-a85c-6062fd0e76f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.651740 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f5014a8f-fe3e-404c-a85c-6062fd0e76f7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-58xj7\" (UID: \"f5014a8f-fe3e-404c-a85c-6062fd0e76f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.660594 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5014a8f-fe3e-404c-a85c-6062fd0e76f7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-58xj7\" (UID: \"f5014a8f-fe3e-404c-a85c-6062fd0e76f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.668184 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nkc9\" (UniqueName: \"kubernetes.io/projected/f5014a8f-fe3e-404c-a85c-6062fd0e76f7-kube-api-access-7nkc9\") pod \"marketplace-operator-79b997595-58xj7\" (UID: \"f5014a8f-fe3e-404c-a85c-6062fd0e76f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.731110 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr"] Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.733500 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.744814 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr"] Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.851209 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38622dfb-ce12-4e38-a718-2b89e3dca594-client-ca\") pod \"route-controller-manager-64886dcbf7-bsrjr\" (UID: \"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.851290 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38622dfb-ce12-4e38-a718-2b89e3dca594-config\") pod \"route-controller-manager-64886dcbf7-bsrjr\" (UID: \"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.851349 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38622dfb-ce12-4e38-a718-2b89e3dca594-serving-cert\") pod \"route-controller-manager-64886dcbf7-bsrjr\" (UID: \"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.851490 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxlpw\" (UniqueName: \"kubernetes.io/projected/38622dfb-ce12-4e38-a718-2b89e3dca594-kube-api-access-mxlpw\") pod \"route-controller-manager-64886dcbf7-bsrjr\" (UID: 
\"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.865343 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.911783 4777 generic.go:334] "Generic (PLEG): container finished" podID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerID="63ba03eead54b2094492589d7645f4e4f852c1b89093cf0c4b209fd11e63cdc9" exitCode=0 Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.911879 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b76h6" event={"ID":"f0761e7f-6d18-4832-be21-89d3415e7b21","Type":"ContainerDied","Data":"63ba03eead54b2094492589d7645f4e4f852c1b89093cf0c4b209fd11e63cdc9"} Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.915295 4777 generic.go:334] "Generic (PLEG): container finished" podID="93764402-52dc-48ce-9352-bd7982f7f0d6" containerID="629b899e1d5dc8bcb2a60f4b27ef8e0f5a975fd73c9ce40270f2f460998b0e8c" exitCode=0 Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.915392 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrsdv" event={"ID":"93764402-52dc-48ce-9352-bd7982f7f0d6","Type":"ContainerDied","Data":"629b899e1d5dc8bcb2a60f4b27ef8e0f5a975fd73c9ce40270f2f460998b0e8c"} Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.919909 4777 generic.go:334] "Generic (PLEG): container finished" podID="01c46266-f08a-405d-8768-8075a15bb61d" containerID="b9e01d6bf7484354224c17867982d582bfa0ed5182c2f9099ce1af9f984441a6" exitCode=0 Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.920050 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k68t2" 
event={"ID":"01c46266-f08a-405d-8768-8075a15bb61d","Type":"ContainerDied","Data":"b9e01d6bf7484354224c17867982d582bfa0ed5182c2f9099ce1af9f984441a6"} Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.925173 4777 generic.go:334] "Generic (PLEG): container finished" podID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" containerID="fe789c04e2a9539ab8e6cc106f96e70ef884fca95069b6136608e6a240fc54e0" exitCode=0 Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.925391 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxjq" event={"ID":"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf","Type":"ContainerDied","Data":"fe789c04e2a9539ab8e6cc106f96e70ef884fca95069b6136608e6a240fc54e0"} Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.928364 4777 generic.go:334] "Generic (PLEG): container finished" podID="acf589f2-0447-4431-92cc-73956a345b44" containerID="261fac8202ca645dbfcd699349752872862546da3c33e57b7676ee1e201250d3" exitCode=0 Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.928549 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" event={"ID":"acf589f2-0447-4431-92cc-73956a345b44","Type":"ContainerDied","Data":"261fac8202ca645dbfcd699349752872862546da3c33e57b7676ee1e201250d3"} Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.928733 4777 scope.go:117] "RemoveContainer" containerID="d04996da245f1a942dff21e11727be6dd4105be864af59a55c7bc6ef30077764" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.930790 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.952843 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38622dfb-ce12-4e38-a718-2b89e3dca594-config\") pod \"route-controller-manager-64886dcbf7-bsrjr\" (UID: \"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.952968 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38622dfb-ce12-4e38-a718-2b89e3dca594-serving-cert\") pod \"route-controller-manager-64886dcbf7-bsrjr\" (UID: \"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.953023 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxlpw\" (UniqueName: \"kubernetes.io/projected/38622dfb-ce12-4e38-a718-2b89e3dca594-kube-api-access-mxlpw\") pod \"route-controller-manager-64886dcbf7-bsrjr\" (UID: \"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.953063 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38622dfb-ce12-4e38-a718-2b89e3dca594-client-ca\") pod \"route-controller-manager-64886dcbf7-bsrjr\" (UID: \"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.954049 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/38622dfb-ce12-4e38-a718-2b89e3dca594-client-ca\") pod \"route-controller-manager-64886dcbf7-bsrjr\" (UID: \"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.955880 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38622dfb-ce12-4e38-a718-2b89e3dca594-config\") pod \"route-controller-manager-64886dcbf7-bsrjr\" (UID: \"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.961319 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nrsdv" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.963865 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj"] Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.965553 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38622dfb-ce12-4e38-a718-2b89e3dca594-serving-cert\") pod \"route-controller-manager-64886dcbf7-bsrjr\" (UID: \"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.968635 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cffcc444-7k8wj"] Feb 16 21:44:23 crc kubenswrapper[4777]: I0216 21:44:23.973314 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxlpw\" (UniqueName: \"kubernetes.io/projected/38622dfb-ce12-4e38-a718-2b89e3dca594-kube-api-access-mxlpw\") pod 
\"route-controller-manager-64886dcbf7-bsrjr\" (UID: \"38622dfb-ce12-4e38-a718-2b89e3dca594\") " pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.053831 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lw8n\" (UniqueName: \"kubernetes.io/projected/93764402-52dc-48ce-9352-bd7982f7f0d6-kube-api-access-5lw8n\") pod \"93764402-52dc-48ce-9352-bd7982f7f0d6\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.053963 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-utilities\") pod \"93764402-52dc-48ce-9352-bd7982f7f0d6\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.054058 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-catalog-content\") pod \"93764402-52dc-48ce-9352-bd7982f7f0d6\" (UID: \"93764402-52dc-48ce-9352-bd7982f7f0d6\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.058188 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-utilities" (OuterVolumeSpecName: "utilities") pod "93764402-52dc-48ce-9352-bd7982f7f0d6" (UID: "93764402-52dc-48ce-9352-bd7982f7f0d6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.063149 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93764402-52dc-48ce-9352-bd7982f7f0d6-kube-api-access-5lw8n" (OuterVolumeSpecName: "kube-api-access-5lw8n") pod "93764402-52dc-48ce-9352-bd7982f7f0d6" (UID: "93764402-52dc-48ce-9352-bd7982f7f0d6"). InnerVolumeSpecName "kube-api-access-5lw8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.124262 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93764402-52dc-48ce-9352-bd7982f7f0d6" (UID: "93764402-52dc-48ce-9352-bd7982f7f0d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.155793 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lw8n\" (UniqueName: \"kubernetes.io/projected/93764402-52dc-48ce-9352-bd7982f7f0d6-kube-api-access-5lw8n\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.155833 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.155844 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93764402-52dc-48ce-9352-bd7982f7f0d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.173957 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.193563 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598f437c-cb76-4c3a-88ef-459dbdc705b7" path="/var/lib/kubelet/pods/598f437c-cb76-4c3a-88ef-459dbdc705b7/volumes" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.205478 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.231495 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzxjq" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.240587 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.247052 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.258757 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-utilities\") pod \"f0761e7f-6d18-4832-be21-89d3415e7b21\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.258811 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-catalog-content\") pod \"f0761e7f-6d18-4832-be21-89d3415e7b21\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.258840 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acf589f2-0447-4431-92cc-73956a345b44-marketplace-trusted-ca\") pod \"acf589f2-0447-4431-92cc-73956a345b44\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.258875 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-utilities\") pod \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.258902 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-catalog-content\") pod \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.258932 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acf589f2-0447-4431-92cc-73956a345b44-marketplace-operator-metrics\") pod \"acf589f2-0447-4431-92cc-73956a345b44\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.258989 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25tx2\" (UniqueName: \"kubernetes.io/projected/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-kube-api-access-25tx2\") pod \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\" (UID: \"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.259015 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw7zz\" (UniqueName: \"kubernetes.io/projected/acf589f2-0447-4431-92cc-73956a345b44-kube-api-access-pw7zz\") pod \"acf589f2-0447-4431-92cc-73956a345b44\" (UID: \"acf589f2-0447-4431-92cc-73956a345b44\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.259032 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2bq5\" (UniqueName: \"kubernetes.io/projected/f0761e7f-6d18-4832-be21-89d3415e7b21-kube-api-access-q2bq5\") pod \"f0761e7f-6d18-4832-be21-89d3415e7b21\" (UID: \"f0761e7f-6d18-4832-be21-89d3415e7b21\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.260763 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-utilities" (OuterVolumeSpecName: "utilities") pod "f0761e7f-6d18-4832-be21-89d3415e7b21" (UID: "f0761e7f-6d18-4832-be21-89d3415e7b21"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.261232 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-utilities" (OuterVolumeSpecName: "utilities") pod "a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" (UID: "a6da8337-cf15-4ee9-a4f9-c7047aad3cdf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.262082 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acf589f2-0447-4431-92cc-73956a345b44-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "acf589f2-0447-4431-92cc-73956a345b44" (UID: "acf589f2-0447-4431-92cc-73956a345b44"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.264355 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0761e7f-6d18-4832-be21-89d3415e7b21-kube-api-access-q2bq5" (OuterVolumeSpecName: "kube-api-access-q2bq5") pod "f0761e7f-6d18-4832-be21-89d3415e7b21" (UID: "f0761e7f-6d18-4832-be21-89d3415e7b21"). InnerVolumeSpecName "kube-api-access-q2bq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.264705 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf589f2-0447-4431-92cc-73956a345b44-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "acf589f2-0447-4431-92cc-73956a345b44" (UID: "acf589f2-0447-4431-92cc-73956a345b44"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.265488 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-kube-api-access-25tx2" (OuterVolumeSpecName: "kube-api-access-25tx2") pod "a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" (UID: "a6da8337-cf15-4ee9-a4f9-c7047aad3cdf"). InnerVolumeSpecName "kube-api-access-25tx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.277178 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acf589f2-0447-4431-92cc-73956a345b44-kube-api-access-pw7zz" (OuterVolumeSpecName: "kube-api-access-pw7zz") pod "acf589f2-0447-4431-92cc-73956a345b44" (UID: "acf589f2-0447-4431-92cc-73956a345b44"). InnerVolumeSpecName "kube-api-access-pw7zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.331917 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" (UID: "a6da8337-cf15-4ee9-a4f9-c7047aad3cdf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.359918 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-catalog-content\") pod \"01c46266-f08a-405d-8768-8075a15bb61d\" (UID: \"01c46266-f08a-405d-8768-8075a15bb61d\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.360061 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-utilities\") pod \"01c46266-f08a-405d-8768-8075a15bb61d\" (UID: \"01c46266-f08a-405d-8768-8075a15bb61d\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.360091 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxjwv\" (UniqueName: \"kubernetes.io/projected/01c46266-f08a-405d-8768-8075a15bb61d-kube-api-access-dxjwv\") pod \"01c46266-f08a-405d-8768-8075a15bb61d\" (UID: \"01c46266-f08a-405d-8768-8075a15bb61d\") " Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.360315 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.360329 4777 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acf589f2-0447-4431-92cc-73956a345b44-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.360340 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.360349 4777 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.360358 4777 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acf589f2-0447-4431-92cc-73956a345b44-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.360368 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25tx2\" (UniqueName: \"kubernetes.io/projected/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf-kube-api-access-25tx2\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.360378 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw7zz\" (UniqueName: \"kubernetes.io/projected/acf589f2-0447-4431-92cc-73956a345b44-kube-api-access-pw7zz\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.360386 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2bq5\" (UniqueName: \"kubernetes.io/projected/f0761e7f-6d18-4832-be21-89d3415e7b21-kube-api-access-q2bq5\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.361826 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-utilities" (OuterVolumeSpecName: "utilities") pod "01c46266-f08a-405d-8768-8075a15bb61d" (UID: "01c46266-f08a-405d-8768-8075a15bb61d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.365407 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01c46266-f08a-405d-8768-8075a15bb61d-kube-api-access-dxjwv" (OuterVolumeSpecName: "kube-api-access-dxjwv") pod "01c46266-f08a-405d-8768-8075a15bb61d" (UID: "01c46266-f08a-405d-8768-8075a15bb61d"). InnerVolumeSpecName "kube-api-access-dxjwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.384097 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01c46266-f08a-405d-8768-8075a15bb61d" (UID: "01c46266-f08a-405d-8768-8075a15bb61d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.412637 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-58xj7"] Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.414543 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0761e7f-6d18-4832-be21-89d3415e7b21" (UID: "f0761e7f-6d18-4832-be21-89d3415e7b21"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:44:24 crc kubenswrapper[4777]: W0216 21:44:24.431199 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5014a8f_fe3e_404c_a85c_6062fd0e76f7.slice/crio-10f73eb8ca0bfe5cc24028ddbbeab19ae299c6e909215f1d0cfe35987f87896c WatchSource:0}: Error finding container 10f73eb8ca0bfe5cc24028ddbbeab19ae299c6e909215f1d0cfe35987f87896c: Status 404 returned error can't find the container with id 10f73eb8ca0bfe5cc24028ddbbeab19ae299c6e909215f1d0cfe35987f87896c Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.460899 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.460926 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxjwv\" (UniqueName: \"kubernetes.io/projected/01c46266-f08a-405d-8768-8075a15bb61d-kube-api-access-dxjwv\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.460940 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0761e7f-6d18-4832-be21-89d3415e7b21-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.460948 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01c46266-f08a-405d-8768-8075a15bb61d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.630593 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr"] Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.938953 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" event={"ID":"f5014a8f-fe3e-404c-a85c-6062fd0e76f7","Type":"ContainerStarted","Data":"6dffc46fb5350fa455f83c9eed9fc71c64a58057da804bdaf05b483a1d3122ca"} Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.939438 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" event={"ID":"f5014a8f-fe3e-404c-a85c-6062fd0e76f7","Type":"ContainerStarted","Data":"10f73eb8ca0bfe5cc24028ddbbeab19ae299c6e909215f1d0cfe35987f87896c"} Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.939465 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.942022 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b76h6" event={"ID":"f0761e7f-6d18-4832-be21-89d3415e7b21","Type":"ContainerDied","Data":"7ffd6aecd1e19ec403c8d9a24358d3706a9d0705116aa74bb197776a88d3ff71"} Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.942068 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b76h6" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.942080 4777 scope.go:117] "RemoveContainer" containerID="63ba03eead54b2094492589d7645f4e4f852c1b89093cf0c4b209fd11e63cdc9" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.943626 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.944015 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" event={"ID":"38622dfb-ce12-4e38-a718-2b89e3dca594","Type":"ContainerStarted","Data":"1d56c6d2e90acc45607e65363b99544e0e09c8b7a548d8d16637760ac8324e7b"} Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.944051 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" event={"ID":"38622dfb-ce12-4e38-a718-2b89e3dca594","Type":"ContainerStarted","Data":"5af94641f6b3cf065bd9d0ea3f0ca35c01f77d4c1d610e88df88299c3481edd5"} Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.944218 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.948410 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nrsdv" event={"ID":"93764402-52dc-48ce-9352-bd7982f7f0d6","Type":"ContainerDied","Data":"9f4b9ce61816a48a4a2c6e7374a733caf05a9861f6e5da740c10f5820a2840ac"} Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.948587 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nrsdv" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.955399 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k68t2" event={"ID":"01c46266-f08a-405d-8768-8075a15bb61d","Type":"ContainerDied","Data":"c591359e258c564ecbce36f8db609ee9fde356b998ccd3d5767b222c8d6633fc"} Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.955507 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k68t2" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.961172 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzxjq" event={"ID":"a6da8337-cf15-4ee9-a4f9-c7047aad3cdf","Type":"ContainerDied","Data":"8ded80d680336ca291a68e69af2b1599d779faeb1610c0fa656b51e88f640f91"} Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.961514 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzxjq" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.964287 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" event={"ID":"acf589f2-0447-4431-92cc-73956a345b44","Type":"ContainerDied","Data":"5dba7c803ea67153cc734e188a62ee961d392e5c18f053cb2001a65a8ff98c2b"} Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.964349 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfdd6" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.964678 4777 scope.go:117] "RemoveContainer" containerID="ca18a989d5d75588fb4c909d29d32f479c56fa5a9a50ae5c903273f12d93c3a1" Feb 16 21:44:24 crc kubenswrapper[4777]: I0216 21:44:24.987156 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-58xj7" podStartSLOduration=1.987110674 podStartE2EDuration="1.987110674s" podCreationTimestamp="2026-02-16 21:44:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:44:24.967917817 +0000 UTC m=+385.550418919" watchObservedRunningTime="2026-02-16 21:44:24.987110674 +0000 UTC m=+385.569611816" Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.003131 4777 scope.go:117] "RemoveContainer" containerID="bdbb6a62c747450ddad161ecb3243d32ca48d2bd1e017dc59cbbcb6df8da4280" Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.017256 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr" podStartSLOduration=3.017222616 podStartE2EDuration="3.017222616s" podCreationTimestamp="2026-02-16 21:44:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:44:24.997572256 +0000 UTC m=+385.580073368" watchObservedRunningTime="2026-02-16 21:44:25.017222616 +0000 UTC m=+385.599723728" Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.025985 4777 scope.go:117] "RemoveContainer" containerID="629b899e1d5dc8bcb2a60f4b27ef8e0f5a975fd73c9ce40270f2f460998b0e8c" Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.049230 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b76h6"] Feb 16 
21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.055156 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b76h6"] Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.068092 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nrsdv"] Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.081538 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nrsdv"] Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.087228 4777 scope.go:117] "RemoveContainer" containerID="a8f92bfa7ebe9c97001da2914e8e3b3124a6794ccbfc182cb5fc12a753c33e1b" Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.098896 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wzxjq"] Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.102617 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wzxjq"] Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.104204 4777 scope.go:117] "RemoveContainer" containerID="b56598416149d2334df348d32552f0140ca772c58d9ab50ee4d8ff5311eb3f84" Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.116223 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k68t2"] Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.122732 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k68t2"] Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.132460 4777 scope.go:117] "RemoveContainer" containerID="b9e01d6bf7484354224c17867982d582bfa0ed5182c2f9099ce1af9f984441a6" Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.132591 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfdd6"] Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.134877 4777 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfdd6"]
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.152246 4777 scope.go:117] "RemoveContainer" containerID="7765449e873a85b6c1547f45164cc9a3b1826e2d29f00c2f66efb31607476b4c"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.172194 4777 scope.go:117] "RemoveContainer" containerID="57fa2c12ec0b92da7d1de7cfe3dd24ffb17e03ce32e657f18e50f86a5ae937ca"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.189825 4777 scope.go:117] "RemoveContainer" containerID="fe789c04e2a9539ab8e6cc106f96e70ef884fca95069b6136608e6a240fc54e0"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.204586 4777 scope.go:117] "RemoveContainer" containerID="bac295f9485b020674522d695d0fe63dd2cbe3bf620785ea9efd01e6d17c1e21"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.220144 4777 scope.go:117] "RemoveContainer" containerID="d96bf99acf21c9d55f56f625c5a9b34a6b9e35ffd110289f202824088fbf59c2"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.238694 4777 scope.go:117] "RemoveContainer" containerID="261fac8202ca645dbfcd699349752872862546da3c33e57b7676ee1e201250d3"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.299776 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-64886dcbf7-bsrjr"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.647243 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-52t67"]
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.647876 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf589f2-0447-4431-92cc-73956a345b44" containerName="marketplace-operator"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.647897 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf589f2-0447-4431-92cc-73956a345b44" containerName="marketplace-operator"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.647910 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.647920 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.647938 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" containerName="extract-utilities"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.647947 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" containerName="extract-utilities"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.647956 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93764402-52dc-48ce-9352-bd7982f7f0d6" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.647966 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="93764402-52dc-48ce-9352-bd7982f7f0d6" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.647976 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01c46266-f08a-405d-8768-8075a15bb61d" containerName="extract-utilities"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.647985 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c46266-f08a-405d-8768-8075a15bb61d" containerName="extract-utilities"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.647998 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerName="extract-utilities"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648007 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerName="extract-utilities"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.648018 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93764402-52dc-48ce-9352-bd7982f7f0d6" containerName="extract-content"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648027 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="93764402-52dc-48ce-9352-bd7982f7f0d6" containerName="extract-content"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.648039 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerName="extract-content"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648048 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerName="extract-content"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.648069 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93764402-52dc-48ce-9352-bd7982f7f0d6" containerName="extract-utilities"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648078 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="93764402-52dc-48ce-9352-bd7982f7f0d6" containerName="extract-utilities"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.648089 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf589f2-0447-4431-92cc-73956a345b44" containerName="marketplace-operator"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648099 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf589f2-0447-4431-92cc-73956a345b44" containerName="marketplace-operator"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.648111 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648119 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.648131 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" containerName="extract-content"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648140 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" containerName="extract-content"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.648154 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01c46266-f08a-405d-8768-8075a15bb61d" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648164 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c46266-f08a-405d-8768-8075a15bb61d" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: E0216 21:44:25.648176 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01c46266-f08a-405d-8768-8075a15bb61d" containerName="extract-content"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648184 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c46266-f08a-405d-8768-8075a15bb61d" containerName="extract-content"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648330 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648350 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="93764402-52dc-48ce-9352-bd7982f7f0d6" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648360 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="01c46266-f08a-405d-8768-8075a15bb61d" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648373 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="acf589f2-0447-4431-92cc-73956a345b44" containerName="marketplace-operator"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648389 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" containerName="registry-server"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.648649 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="acf589f2-0447-4431-92cc-73956a345b44" containerName="marketplace-operator"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.649625 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52t67"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.655348 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-52t67"]
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.662636 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.677741 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92089ab-26b0-4604-a957-9e37cc949736-catalog-content\") pod \"redhat-marketplace-52t67\" (UID: \"c92089ab-26b0-4604-a957-9e37cc949736\") " pod="openshift-marketplace/redhat-marketplace-52t67"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.677794 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62r5p\" (UniqueName: \"kubernetes.io/projected/c92089ab-26b0-4604-a957-9e37cc949736-kube-api-access-62r5p\") pod \"redhat-marketplace-52t67\" (UID: \"c92089ab-26b0-4604-a957-9e37cc949736\") " pod="openshift-marketplace/redhat-marketplace-52t67"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.677844 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92089ab-26b0-4604-a957-9e37cc949736-utilities\") pod \"redhat-marketplace-52t67\" (UID: \"c92089ab-26b0-4604-a957-9e37cc949736\") " pod="openshift-marketplace/redhat-marketplace-52t67"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.779266 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62r5p\" (UniqueName: \"kubernetes.io/projected/c92089ab-26b0-4604-a957-9e37cc949736-kube-api-access-62r5p\") pod \"redhat-marketplace-52t67\" (UID: \"c92089ab-26b0-4604-a957-9e37cc949736\") " pod="openshift-marketplace/redhat-marketplace-52t67"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.779678 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92089ab-26b0-4604-a957-9e37cc949736-utilities\") pod \"redhat-marketplace-52t67\" (UID: \"c92089ab-26b0-4604-a957-9e37cc949736\") " pod="openshift-marketplace/redhat-marketplace-52t67"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.779908 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92089ab-26b0-4604-a957-9e37cc949736-catalog-content\") pod \"redhat-marketplace-52t67\" (UID: \"c92089ab-26b0-4604-a957-9e37cc949736\") " pod="openshift-marketplace/redhat-marketplace-52t67"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.780390 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92089ab-26b0-4604-a957-9e37cc949736-utilities\") pod \"redhat-marketplace-52t67\" (UID: \"c92089ab-26b0-4604-a957-9e37cc949736\") " pod="openshift-marketplace/redhat-marketplace-52t67"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.780444 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92089ab-26b0-4604-a957-9e37cc949736-catalog-content\") pod \"redhat-marketplace-52t67\" (UID: \"c92089ab-26b0-4604-a957-9e37cc949736\") " pod="openshift-marketplace/redhat-marketplace-52t67"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.822414 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62r5p\" (UniqueName: \"kubernetes.io/projected/c92089ab-26b0-4604-a957-9e37cc949736-kube-api-access-62r5p\") pod \"redhat-marketplace-52t67\" (UID: \"c92089ab-26b0-4604-a957-9e37cc949736\") " pod="openshift-marketplace/redhat-marketplace-52t67"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.825242 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-njbxw"]
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.827264 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-njbxw"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.830621 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.836085 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-njbxw"]
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.881392 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b2161c-e508-4ce9-bcdd-185a02223d1c-catalog-content\") pod \"redhat-operators-njbxw\" (UID: \"00b2161c-e508-4ce9-bcdd-185a02223d1c\") " pod="openshift-marketplace/redhat-operators-njbxw"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.881503 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b2161c-e508-4ce9-bcdd-185a02223d1c-utilities\") pod \"redhat-operators-njbxw\" (UID: \"00b2161c-e508-4ce9-bcdd-185a02223d1c\") " pod="openshift-marketplace/redhat-operators-njbxw"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.882062 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwbkv\" (UniqueName: \"kubernetes.io/projected/00b2161c-e508-4ce9-bcdd-185a02223d1c-kube-api-access-fwbkv\") pod \"redhat-operators-njbxw\" (UID: \"00b2161c-e508-4ce9-bcdd-185a02223d1c\") " pod="openshift-marketplace/redhat-operators-njbxw"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.982944 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwbkv\" (UniqueName: \"kubernetes.io/projected/00b2161c-e508-4ce9-bcdd-185a02223d1c-kube-api-access-fwbkv\") pod \"redhat-operators-njbxw\" (UID: \"00b2161c-e508-4ce9-bcdd-185a02223d1c\") " pod="openshift-marketplace/redhat-operators-njbxw"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.983037 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b2161c-e508-4ce9-bcdd-185a02223d1c-catalog-content\") pod \"redhat-operators-njbxw\" (UID: \"00b2161c-e508-4ce9-bcdd-185a02223d1c\") " pod="openshift-marketplace/redhat-operators-njbxw"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.983099 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b2161c-e508-4ce9-bcdd-185a02223d1c-utilities\") pod \"redhat-operators-njbxw\" (UID: \"00b2161c-e508-4ce9-bcdd-185a02223d1c\") " pod="openshift-marketplace/redhat-operators-njbxw"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.984026 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b2161c-e508-4ce9-bcdd-185a02223d1c-utilities\") pod \"redhat-operators-njbxw\" (UID: \"00b2161c-e508-4ce9-bcdd-185a02223d1c\") " pod="openshift-marketplace/redhat-operators-njbxw"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.984219 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-52t67"
Feb 16 21:44:25 crc kubenswrapper[4777]: I0216 21:44:25.984215 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b2161c-e508-4ce9-bcdd-185a02223d1c-catalog-content\") pod \"redhat-operators-njbxw\" (UID: \"00b2161c-e508-4ce9-bcdd-185a02223d1c\") " pod="openshift-marketplace/redhat-operators-njbxw"
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.002406 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwbkv\" (UniqueName: \"kubernetes.io/projected/00b2161c-e508-4ce9-bcdd-185a02223d1c-kube-api-access-fwbkv\") pod \"redhat-operators-njbxw\" (UID: \"00b2161c-e508-4ce9-bcdd-185a02223d1c\") " pod="openshift-marketplace/redhat-operators-njbxw"
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.156772 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-njbxw"
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.192601 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01c46266-f08a-405d-8768-8075a15bb61d" path="/var/lib/kubelet/pods/01c46266-f08a-405d-8768-8075a15bb61d/volumes"
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.193442 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93764402-52dc-48ce-9352-bd7982f7f0d6" path="/var/lib/kubelet/pods/93764402-52dc-48ce-9352-bd7982f7f0d6/volumes"
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.194697 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6da8337-cf15-4ee9-a4f9-c7047aad3cdf" path="/var/lib/kubelet/pods/a6da8337-cf15-4ee9-a4f9-c7047aad3cdf/volumes"
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.196698 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acf589f2-0447-4431-92cc-73956a345b44" path="/var/lib/kubelet/pods/acf589f2-0447-4431-92cc-73956a345b44/volumes"
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.198076 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0761e7f-6d18-4832-be21-89d3415e7b21" path="/var/lib/kubelet/pods/f0761e7f-6d18-4832-be21-89d3415e7b21/volumes"
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.420543 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-52t67"]
Feb 16 21:44:26 crc kubenswrapper[4777]: W0216 21:44:26.421441 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc92089ab_26b0_4604_a957_9e37cc949736.slice/crio-431cdb250c7ae89bf886ed9c83a3f0876edde24532d26a315ddadaaeb3b3fa9a WatchSource:0}: Error finding container 431cdb250c7ae89bf886ed9c83a3f0876edde24532d26a315ddadaaeb3b3fa9a: Status 404 returned error can't find the container with id 431cdb250c7ae89bf886ed9c83a3f0876edde24532d26a315ddadaaeb3b3fa9a
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.600987 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-njbxw"]
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.989777 4777 generic.go:334] "Generic (PLEG): container finished" podID="c92089ab-26b0-4604-a957-9e37cc949736" containerID="860853c91c9ef0b7b8a4e9e49eaf63fd8b2a5c2a37f68abf10cfe1fc3f7a4371" exitCode=0
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.989871 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52t67" event={"ID":"c92089ab-26b0-4604-a957-9e37cc949736","Type":"ContainerDied","Data":"860853c91c9ef0b7b8a4e9e49eaf63fd8b2a5c2a37f68abf10cfe1fc3f7a4371"}
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.989906 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52t67" event={"ID":"c92089ab-26b0-4604-a957-9e37cc949736","Type":"ContainerStarted","Data":"431cdb250c7ae89bf886ed9c83a3f0876edde24532d26a315ddadaaeb3b3fa9a"}
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.992597 4777 generic.go:334] "Generic (PLEG): container finished" podID="00b2161c-e508-4ce9-bcdd-185a02223d1c" containerID="eb89b359273751c32a266c3733532f785897dfdcd6936bcc88dec5e0d1923705" exitCode=0
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.992659 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njbxw" event={"ID":"00b2161c-e508-4ce9-bcdd-185a02223d1c","Type":"ContainerDied","Data":"eb89b359273751c32a266c3733532f785897dfdcd6936bcc88dec5e0d1923705"}
Feb 16 21:44:26 crc kubenswrapper[4777]: I0216 21:44:26.992704 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njbxw" event={"ID":"00b2161c-e508-4ce9-bcdd-185a02223d1c","Type":"ContainerStarted","Data":"065450cf394fac8c4016b21acc06cf7cb303282a0451eaa239aa4fef12c35db5"}
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.000423 4777 generic.go:334] "Generic (PLEG): container finished" podID="c92089ab-26b0-4604-a957-9e37cc949736" containerID="ba01aa7109c35d63682df21f9c1719b7629efa55a846518f367b1034ec85ab15" exitCode=0
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.000516 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52t67" event={"ID":"c92089ab-26b0-4604-a957-9e37cc949736","Type":"ContainerDied","Data":"ba01aa7109c35d63682df21f9c1719b7629efa55a846518f367b1034ec85ab15"}
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.004654 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njbxw" event={"ID":"00b2161c-e508-4ce9-bcdd-185a02223d1c","Type":"ContainerStarted","Data":"1c56de122fde67d6e86b84315df79d907e1decd1838fd8f9407566aa13d525c7"}
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.023341 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9w66m"]
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.024357 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w66m"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.029752 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.044861 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w66m"]
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.106426 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrjsp\" (UniqueName: \"kubernetes.io/projected/730c4cf5-a183-42de-be25-3521a01e5905-kube-api-access-lrjsp\") pod \"community-operators-9w66m\" (UID: \"730c4cf5-a183-42de-be25-3521a01e5905\") " pod="openshift-marketplace/community-operators-9w66m"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.106478 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/730c4cf5-a183-42de-be25-3521a01e5905-catalog-content\") pod \"community-operators-9w66m\" (UID: \"730c4cf5-a183-42de-be25-3521a01e5905\") " pod="openshift-marketplace/community-operators-9w66m"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.106562 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/730c4cf5-a183-42de-be25-3521a01e5905-utilities\") pod \"community-operators-9w66m\" (UID: \"730c4cf5-a183-42de-be25-3521a01e5905\") " pod="openshift-marketplace/community-operators-9w66m"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.207771 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/730c4cf5-a183-42de-be25-3521a01e5905-utilities\") pod \"community-operators-9w66m\" (UID: \"730c4cf5-a183-42de-be25-3521a01e5905\") " pod="openshift-marketplace/community-operators-9w66m"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.207846 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrjsp\" (UniqueName: \"kubernetes.io/projected/730c4cf5-a183-42de-be25-3521a01e5905-kube-api-access-lrjsp\") pod \"community-operators-9w66m\" (UID: \"730c4cf5-a183-42de-be25-3521a01e5905\") " pod="openshift-marketplace/community-operators-9w66m"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.207867 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/730c4cf5-a183-42de-be25-3521a01e5905-catalog-content\") pod \"community-operators-9w66m\" (UID: \"730c4cf5-a183-42de-be25-3521a01e5905\") " pod="openshift-marketplace/community-operators-9w66m"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.208888 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/730c4cf5-a183-42de-be25-3521a01e5905-catalog-content\") pod \"community-operators-9w66m\" (UID: \"730c4cf5-a183-42de-be25-3521a01e5905\") " pod="openshift-marketplace/community-operators-9w66m"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.209263 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/730c4cf5-a183-42de-be25-3521a01e5905-utilities\") pod \"community-operators-9w66m\" (UID: \"730c4cf5-a183-42de-be25-3521a01e5905\") " pod="openshift-marketplace/community-operators-9w66m"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.222959 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m65p9"]
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.223915 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m65p9"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.226250 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.245968 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m65p9"]
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.256006 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrjsp\" (UniqueName: \"kubernetes.io/projected/730c4cf5-a183-42de-be25-3521a01e5905-kube-api-access-lrjsp\") pod \"community-operators-9w66m\" (UID: \"730c4cf5-a183-42de-be25-3521a01e5905\") " pod="openshift-marketplace/community-operators-9w66m"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.308609 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed77d1bb-44c6-4674-b17f-7c451302773a-utilities\") pod \"certified-operators-m65p9\" (UID: \"ed77d1bb-44c6-4674-b17f-7c451302773a\") " pod="openshift-marketplace/certified-operators-m65p9"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.309059 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed77d1bb-44c6-4674-b17f-7c451302773a-catalog-content\") pod \"certified-operators-m65p9\" (UID: \"ed77d1bb-44c6-4674-b17f-7c451302773a\") " pod="openshift-marketplace/certified-operators-m65p9"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.309119 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjqrr\" (UniqueName: \"kubernetes.io/projected/ed77d1bb-44c6-4674-b17f-7c451302773a-kube-api-access-bjqrr\") pod \"certified-operators-m65p9\" (UID: \"ed77d1bb-44c6-4674-b17f-7c451302773a\") " pod="openshift-marketplace/certified-operators-m65p9"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.350250 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w66m"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.409651 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed77d1bb-44c6-4674-b17f-7c451302773a-catalog-content\") pod \"certified-operators-m65p9\" (UID: \"ed77d1bb-44c6-4674-b17f-7c451302773a\") " pod="openshift-marketplace/certified-operators-m65p9"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.409767 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjqrr\" (UniqueName: \"kubernetes.io/projected/ed77d1bb-44c6-4674-b17f-7c451302773a-kube-api-access-bjqrr\") pod \"certified-operators-m65p9\" (UID: \"ed77d1bb-44c6-4674-b17f-7c451302773a\") " pod="openshift-marketplace/certified-operators-m65p9"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.409857 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed77d1bb-44c6-4674-b17f-7c451302773a-utilities\") pod \"certified-operators-m65p9\" (UID: \"ed77d1bb-44c6-4674-b17f-7c451302773a\") " pod="openshift-marketplace/certified-operators-m65p9"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.410578 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed77d1bb-44c6-4674-b17f-7c451302773a-catalog-content\") pod \"certified-operators-m65p9\" (UID: \"ed77d1bb-44c6-4674-b17f-7c451302773a\") " pod="openshift-marketplace/certified-operators-m65p9"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.410663 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed77d1bb-44c6-4674-b17f-7c451302773a-utilities\") pod \"certified-operators-m65p9\" (UID: \"ed77d1bb-44c6-4674-b17f-7c451302773a\") " pod="openshift-marketplace/certified-operators-m65p9"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.438700 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjqrr\" (UniqueName: \"kubernetes.io/projected/ed77d1bb-44c6-4674-b17f-7c451302773a-kube-api-access-bjqrr\") pod \"certified-operators-m65p9\" (UID: \"ed77d1bb-44c6-4674-b17f-7c451302773a\") " pod="openshift-marketplace/certified-operators-m65p9"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.572858 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m65p9"
Feb 16 21:44:28 crc kubenswrapper[4777]: I0216 21:44:28.773008 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w66m"]
Feb 16 21:44:28 crc kubenswrapper[4777]: W0216 21:44:28.790788 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod730c4cf5_a183_42de_be25_3521a01e5905.slice/crio-7d2109af5b605e0f52745d798b921cf12c238b19a27e3c4c6a60abe949c5a06e WatchSource:0}: Error finding container 7d2109af5b605e0f52745d798b921cf12c238b19a27e3c4c6a60abe949c5a06e: Status 404 returned error can't find the container with id 7d2109af5b605e0f52745d798b921cf12c238b19a27e3c4c6a60abe949c5a06e
Feb 16 21:44:29 crc kubenswrapper[4777]: I0216 21:44:29.014158 4777 generic.go:334] "Generic (PLEG): container finished" podID="00b2161c-e508-4ce9-bcdd-185a02223d1c" containerID="1c56de122fde67d6e86b84315df79d907e1decd1838fd8f9407566aa13d525c7" exitCode=0
Feb 16 21:44:29 crc kubenswrapper[4777]: I0216 21:44:29.014239 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njbxw" event={"ID":"00b2161c-e508-4ce9-bcdd-185a02223d1c","Type":"ContainerDied","Data":"1c56de122fde67d6e86b84315df79d907e1decd1838fd8f9407566aa13d525c7"}
Feb 16 21:44:29 crc kubenswrapper[4777]: I0216 21:44:29.022178 4777 generic.go:334] "Generic (PLEG): container finished" podID="730c4cf5-a183-42de-be25-3521a01e5905" containerID="8b780837157d0d2f09645e1660bae9816770c0f90d213a3f7017f5c0208fd6e8" exitCode=0
Feb 16 21:44:29 crc kubenswrapper[4777]: I0216 21:44:29.022300 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w66m" event={"ID":"730c4cf5-a183-42de-be25-3521a01e5905","Type":"ContainerDied","Data":"8b780837157d0d2f09645e1660bae9816770c0f90d213a3f7017f5c0208fd6e8"}
Feb 16 21:44:29 crc kubenswrapper[4777]: I0216 21:44:29.022388 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w66m" event={"ID":"730c4cf5-a183-42de-be25-3521a01e5905","Type":"ContainerStarted","Data":"7d2109af5b605e0f52745d798b921cf12c238b19a27e3c4c6a60abe949c5a06e"}
Feb 16 21:44:29 crc kubenswrapper[4777]: I0216 21:44:29.026669 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m65p9"]
Feb 16 21:44:29 crc kubenswrapper[4777]: I0216 21:44:29.032787 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-52t67" event={"ID":"c92089ab-26b0-4604-a957-9e37cc949736","Type":"ContainerStarted","Data":"857bad96cfc621ecc303ba7a6828a857a3307394c6b22e02b9165a3dcb16e17a"}
Feb 16 21:44:29 crc kubenswrapper[4777]: W0216 21:44:29.043575 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded77d1bb_44c6_4674_b17f_7c451302773a.slice/crio-825a760826a799bb1fdc021d95da490b7bd307f107d17d0f37793c4d58516c93 WatchSource:0}: Error finding container 825a760826a799bb1fdc021d95da490b7bd307f107d17d0f37793c4d58516c93: Status 404 returned error can't find the container with id 825a760826a799bb1fdc021d95da490b7bd307f107d17d0f37793c4d58516c93
Feb 16 21:44:29 crc kubenswrapper[4777]: I0216 21:44:29.064405 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-52t67" podStartSLOduration=2.6235234739999997 podStartE2EDuration="4.064387548s" podCreationTimestamp="2026-02-16 21:44:25 +0000 UTC" firstStartedPulling="2026-02-16 21:44:26.99153994 +0000 UTC m=+387.574041052" lastFinishedPulling="2026-02-16 21:44:28.432403994 +0000 UTC m=+389.014905126" observedRunningTime="2026-02-16 21:44:29.062128585 +0000 UTC m=+389.644629687" watchObservedRunningTime="2026-02-16 21:44:29.064387548 +0000 UTC m=+389.646888650"
Feb 16 21:44:30 crc kubenswrapper[4777]: I0216 21:44:30.045149 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njbxw" event={"ID":"00b2161c-e508-4ce9-bcdd-185a02223d1c","Type":"ContainerStarted","Data":"78788d5bc5f5d57ae3a173611d17f7185b216eabca7455b5da97079fb1ac8193"}
Feb 16 21:44:30 crc kubenswrapper[4777]: I0216 21:44:30.048781 4777 generic.go:334] "Generic (PLEG): container finished" podID="ed77d1bb-44c6-4674-b17f-7c451302773a" containerID="7dd95b7f510ab340e5beef3cf3f0b016773ad4d3f803e1508fcc2755d470d4c5" exitCode=0
Feb 16 21:44:30 crc kubenswrapper[4777]: I0216 21:44:30.048857 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m65p9" event={"ID":"ed77d1bb-44c6-4674-b17f-7c451302773a","Type":"ContainerDied","Data":"7dd95b7f510ab340e5beef3cf3f0b016773ad4d3f803e1508fcc2755d470d4c5"}
Feb 16 21:44:30 crc kubenswrapper[4777]: I0216 21:44:30.048894 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m65p9" event={"ID":"ed77d1bb-44c6-4674-b17f-7c451302773a","Type":"ContainerStarted","Data":"825a760826a799bb1fdc021d95da490b7bd307f107d17d0f37793c4d58516c93"}
Feb 16 21:44:30 crc kubenswrapper[4777]: I0216 21:44:30.052341 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w66m" event={"ID":"730c4cf5-a183-42de-be25-3521a01e5905","Type":"ContainerStarted","Data":"f932d133e89ec1a17e7e8c4fbe870030c9e454a0861fd1145b72ce115452d4c9"}
Feb 16 21:44:30 crc kubenswrapper[4777]: I0216 21:44:30.065114 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-njbxw" podStartSLOduration=2.624779949 podStartE2EDuration="5.065090964s" podCreationTimestamp="2026-02-16 21:44:25 +0000 UTC" firstStartedPulling="2026-02-16 21:44:26.995243283 +0000 UTC m=+387.577744395" lastFinishedPulling="2026-02-16 21:44:29.435554308 +0000 UTC m=+390.018055410" observedRunningTime="2026-02-16 21:44:30.06425236 +0000 UTC m=+390.646753472" watchObservedRunningTime="2026-02-16 21:44:30.065090964 +0000 UTC m=+390.647592066"
Feb 16 21:44:31 crc kubenswrapper[4777]: I0216 21:44:31.060674 4777 generic.go:334] "Generic (PLEG): container finished" podID="730c4cf5-a183-42de-be25-3521a01e5905" containerID="f932d133e89ec1a17e7e8c4fbe870030c9e454a0861fd1145b72ce115452d4c9" exitCode=0
Feb 16 21:44:31 crc kubenswrapper[4777]: I0216 21:44:31.062140 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w66m" event={"ID":"730c4cf5-a183-42de-be25-3521a01e5905","Type":"ContainerDied","Data":"f932d133e89ec1a17e7e8c4fbe870030c9e454a0861fd1145b72ce115452d4c9"}
Feb 16 21:44:31 crc kubenswrapper[4777]: I0216 21:44:31.062170 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w66m" event={"ID":"730c4cf5-a183-42de-be25-3521a01e5905","Type":"ContainerStarted","Data":"aab163a03d7b0e87b7ec6f73493f680cc43a17c797e9f0c08b3d3eea6d0daebe"}
Feb 16 21:44:31 crc kubenswrapper[4777]: I0216 21:44:31.067006 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m65p9" event={"ID":"ed77d1bb-44c6-4674-b17f-7c451302773a","Type":"ContainerStarted","Data":"7635c7c30254f044a4a3947d618b811c2b5fa26503e9a06bd08a7e82ab5b70c9"}
Feb 16 21:44:31 crc kubenswrapper[4777]: I0216 21:44:31.084330 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9w66m" podStartSLOduration=1.557806831 podStartE2EDuration="3.084310871s" podCreationTimestamp="2026-02-16 21:44:28 +0000 UTC" firstStartedPulling="2026-02-16 21:44:29.031986502 +0000 UTC m=+389.614487614" lastFinishedPulling="2026-02-16 21:44:30.558490552 +0000 UTC m=+391.140991654" observedRunningTime="2026-02-16 21:44:31.081202003 +0000 UTC m=+391.663703105" watchObservedRunningTime="2026-02-16 21:44:31.084310871 +0000 UTC m=+391.666811963"
Feb 16 21:44:32 crc kubenswrapper[4777]: I0216 21:44:32.091126 4777 generic.go:334] "Generic (PLEG): container finished" podID="ed77d1bb-44c6-4674-b17f-7c451302773a" containerID="7635c7c30254f044a4a3947d618b811c2b5fa26503e9a06bd08a7e82ab5b70c9" exitCode=0
Feb 16 21:44:32 crc kubenswrapper[4777]: I0216 21:44:32.093395 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m65p9" event={"ID":"ed77d1bb-44c6-4674-b17f-7c451302773a","Type":"ContainerDied","Data":"7635c7c30254f044a4a3947d618b811c2b5fa26503e9a06bd08a7e82ab5b70c9"}
Feb 16 21:44:33 crc kubenswrapper[4777]: I0216 21:44:33.101781 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m65p9" event={"ID":"ed77d1bb-44c6-4674-b17f-7c451302773a","Type":"ContainerStarted","Data":"a250fdd5ae959ab8812939e9693871ba5f2493f50c639eaa60d8846ed353a8e1"}
Feb 16 21:44:33 crc kubenswrapper[4777]: I0216 21:44:33.122335 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m65p9" podStartSLOduration=2.694310009 podStartE2EDuration="5.12230169s" podCreationTimestamp="2026-02-16 21:44:28 +0000 UTC"
firstStartedPulling="2026-02-16 21:44:30.05067041 +0000 UTC m=+390.633171512" lastFinishedPulling="2026-02-16 21:44:32.478662081 +0000 UTC m=+393.061163193" observedRunningTime="2026-02-16 21:44:33.117121229 +0000 UTC m=+393.699622321" watchObservedRunningTime="2026-02-16 21:44:33.12230169 +0000 UTC m=+393.704802792" Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.784393 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-62hsx"] Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.785528 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.799074 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-62hsx"] Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.955147 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c86915ba-9e6b-4f54-b26f-af1912a97f4d-registry-certificates\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.955223 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c86915ba-9e6b-4f54-b26f-af1912a97f4d-bound-sa-token\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.955267 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nsn2\" (UniqueName: 
\"kubernetes.io/projected/c86915ba-9e6b-4f54-b26f-af1912a97f4d-kube-api-access-6nsn2\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.955297 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c86915ba-9e6b-4f54-b26f-af1912a97f4d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.955319 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c86915ba-9e6b-4f54-b26f-af1912a97f4d-registry-tls\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.955354 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c86915ba-9e6b-4f54-b26f-af1912a97f4d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.955402 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.955426 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c86915ba-9e6b-4f54-b26f-af1912a97f4d-trusted-ca\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:34 crc kubenswrapper[4777]: I0216 21:44:34.984447 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.057059 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nsn2\" (UniqueName: \"kubernetes.io/projected/c86915ba-9e6b-4f54-b26f-af1912a97f4d-kube-api-access-6nsn2\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.057122 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c86915ba-9e6b-4f54-b26f-af1912a97f4d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.057151 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/c86915ba-9e6b-4f54-b26f-af1912a97f4d-registry-tls\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.057190 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c86915ba-9e6b-4f54-b26f-af1912a97f4d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.057233 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c86915ba-9e6b-4f54-b26f-af1912a97f4d-trusted-ca\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.057275 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c86915ba-9e6b-4f54-b26f-af1912a97f4d-registry-certificates\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.057306 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c86915ba-9e6b-4f54-b26f-af1912a97f4d-bound-sa-token\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.058511 4777 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c86915ba-9e6b-4f54-b26f-af1912a97f4d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.060630 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c86915ba-9e6b-4f54-b26f-af1912a97f4d-trusted-ca\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.061408 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c86915ba-9e6b-4f54-b26f-af1912a97f4d-registry-certificates\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.064828 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c86915ba-9e6b-4f54-b26f-af1912a97f4d-registry-tls\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.067340 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c86915ba-9e6b-4f54-b26f-af1912a97f4d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 
21:44:35.074602 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c86915ba-9e6b-4f54-b26f-af1912a97f4d-bound-sa-token\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.081071 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nsn2\" (UniqueName: \"kubernetes.io/projected/c86915ba-9e6b-4f54-b26f-af1912a97f4d-kube-api-access-6nsn2\") pod \"image-registry-66df7c8f76-62hsx\" (UID: \"c86915ba-9e6b-4f54-b26f-af1912a97f4d\") " pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.170833 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.583856 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-62hsx"] Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.985314 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-52t67" Feb 16 21:44:35 crc kubenswrapper[4777]: I0216 21:44:35.985507 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-52t67" Feb 16 21:44:36 crc kubenswrapper[4777]: I0216 21:44:36.027038 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-52t67" Feb 16 21:44:36 crc kubenswrapper[4777]: I0216 21:44:36.129212 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" 
event={"ID":"c86915ba-9e6b-4f54-b26f-af1912a97f4d","Type":"ContainerStarted","Data":"6156e41ac4a7661cf5a2bb126fb15c9451fe38ca16b6d85224d236d8f18f1af1"} Feb 16 21:44:36 crc kubenswrapper[4777]: I0216 21:44:36.129302 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" event={"ID":"c86915ba-9e6b-4f54-b26f-af1912a97f4d","Type":"ContainerStarted","Data":"94bae5323888a0b72c618be086b1df95e9b5230233baec8323e8568a0b741c2c"} Feb 16 21:44:36 crc kubenswrapper[4777]: I0216 21:44:36.157761 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-njbxw" Feb 16 21:44:36 crc kubenswrapper[4777]: I0216 21:44:36.157818 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-njbxw" Feb 16 21:44:36 crc kubenswrapper[4777]: I0216 21:44:36.165004 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" podStartSLOduration=2.164976519 podStartE2EDuration="2.164976519s" podCreationTimestamp="2026-02-16 21:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:44:36.157370107 +0000 UTC m=+396.739871249" watchObservedRunningTime="2026-02-16 21:44:36.164976519 +0000 UTC m=+396.747477631" Feb 16 21:44:36 crc kubenswrapper[4777]: I0216 21:44:36.202566 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-52t67" Feb 16 21:44:36 crc kubenswrapper[4777]: I0216 21:44:36.205190 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-njbxw" Feb 16 21:44:37 crc kubenswrapper[4777]: I0216 21:44:37.135230 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-66df7c8f76-62hsx" Feb 16 21:44:37 crc kubenswrapper[4777]: I0216 21:44:37.190915 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-njbxw" Feb 16 21:44:38 crc kubenswrapper[4777]: I0216 21:44:38.351321 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9w66m" Feb 16 21:44:38 crc kubenswrapper[4777]: I0216 21:44:38.351410 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9w66m" Feb 16 21:44:38 crc kubenswrapper[4777]: I0216 21:44:38.426386 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9w66m" Feb 16 21:44:38 crc kubenswrapper[4777]: I0216 21:44:38.573390 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m65p9" Feb 16 21:44:38 crc kubenswrapper[4777]: I0216 21:44:38.573491 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m65p9" Feb 16 21:44:38 crc kubenswrapper[4777]: I0216 21:44:38.621852 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m65p9" Feb 16 21:44:39 crc kubenswrapper[4777]: I0216 21:44:39.192555 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9w66m" Feb 16 21:44:39 crc kubenswrapper[4777]: I0216 21:44:39.196810 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m65p9" Feb 16 21:44:41 crc kubenswrapper[4777]: I0216 21:44:41.651652 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:44:41 crc kubenswrapper[4777]: I0216 21:44:41.652079 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:44:41 crc kubenswrapper[4777]: I0216 21:44:41.652133 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 21:44:41 crc kubenswrapper[4777]: I0216 21:44:41.652691 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7667578f694ea6d5a67542ec47257b4fa8954f0c5a3bee2ecf97f6ec3a597dc9"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 16 21:44:41 crc kubenswrapper[4777]: I0216 21:44:41.652763 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://7667578f694ea6d5a67542ec47257b4fa8954f0c5a3bee2ecf97f6ec3a597dc9" gracePeriod=600 Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.167538 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="7667578f694ea6d5a67542ec47257b4fa8954f0c5a3bee2ecf97f6ec3a597dc9" exitCode=0 Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.167625 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" 
event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"7667578f694ea6d5a67542ec47257b4fa8954f0c5a3bee2ecf97f6ec3a597dc9"} Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.168215 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"892b4a604d19cb7c92fac9c2a3fb4800ccf0e20737f719307eb1e9bf24c106b7"} Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.168341 4777 scope.go:117] "RemoveContainer" containerID="1bf772e4f1d26988360c258466dc9932d60c65e13873bdb0072e154b82f64cf9" Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.443913 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-4g2ht"] Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.444527 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" podUID="da461874-eb45-43eb-8878-1647b74378c3" containerName="controller-manager" containerID="cri-o://010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64" gracePeriod=30 Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.933882 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.975480 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-client-ca\") pod \"da461874-eb45-43eb-8878-1647b74378c3\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.976044 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht2ds\" (UniqueName: \"kubernetes.io/projected/da461874-eb45-43eb-8878-1647b74378c3-kube-api-access-ht2ds\") pod \"da461874-eb45-43eb-8878-1647b74378c3\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.976143 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-config\") pod \"da461874-eb45-43eb-8878-1647b74378c3\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.976184 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da461874-eb45-43eb-8878-1647b74378c3-serving-cert\") pod \"da461874-eb45-43eb-8878-1647b74378c3\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.976202 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-proxy-ca-bundles\") pod \"da461874-eb45-43eb-8878-1647b74378c3\" (UID: \"da461874-eb45-43eb-8878-1647b74378c3\") " Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.976470 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-client-ca" (OuterVolumeSpecName: "client-ca") pod "da461874-eb45-43eb-8878-1647b74378c3" (UID: "da461874-eb45-43eb-8878-1647b74378c3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.976788 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-config" (OuterVolumeSpecName: "config") pod "da461874-eb45-43eb-8878-1647b74378c3" (UID: "da461874-eb45-43eb-8878-1647b74378c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.976816 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "da461874-eb45-43eb-8878-1647b74378c3" (UID: "da461874-eb45-43eb-8878-1647b74378c3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.976837 4777 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-client-ca\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.987436 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da461874-eb45-43eb-8878-1647b74378c3-kube-api-access-ht2ds" (OuterVolumeSpecName: "kube-api-access-ht2ds") pod "da461874-eb45-43eb-8878-1647b74378c3" (UID: "da461874-eb45-43eb-8878-1647b74378c3"). InnerVolumeSpecName "kube-api-access-ht2ds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:44:42 crc kubenswrapper[4777]: I0216 21:44:42.994501 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da461874-eb45-43eb-8878-1647b74378c3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "da461874-eb45-43eb-8878-1647b74378c3" (UID: "da461874-eb45-43eb-8878-1647b74378c3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.078373 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.078408 4777 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da461874-eb45-43eb-8878-1647b74378c3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.078420 4777 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da461874-eb45-43eb-8878-1647b74378c3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.078433 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht2ds\" (UniqueName: \"kubernetes.io/projected/da461874-eb45-43eb-8878-1647b74378c3-kube-api-access-ht2ds\") on node \"crc\" DevicePath \"\"" Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.175939 4777 generic.go:334] "Generic (PLEG): container finished" podID="da461874-eb45-43eb-8878-1647b74378c3" containerID="010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64" exitCode=0 Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.175988 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" 
event={"ID":"da461874-eb45-43eb-8878-1647b74378c3","Type":"ContainerDied","Data":"010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64"}
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.176018 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht" event={"ID":"da461874-eb45-43eb-8878-1647b74378c3","Type":"ContainerDied","Data":"c298a38be64ba244f77e3ee227fdf7b02ff9cd940ca7522a5a67500a6980afe5"}
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.176037 4777 scope.go:117] "RemoveContainer" containerID="010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.176128 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6889d7b855-4g2ht"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.192175 4777 scope.go:117] "RemoveContainer" containerID="010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64"
Feb 16 21:44:43 crc kubenswrapper[4777]: E0216 21:44:43.192734 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64\": container with ID starting with 010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64 not found: ID does not exist" containerID="010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.192776 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64"} err="failed to get container status \"010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64\": rpc error: code = NotFound desc = could not find container \"010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64\": container with ID starting with 010662f48239eb6b3c6812966b3ec4ed80d68408c3eee9a7e23d7d50f17f5e64 not found: ID does not exist"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.214647 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-4g2ht"]
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.218729 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6889d7b855-4g2ht"]
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.742315 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"]
Feb 16 21:44:43 crc kubenswrapper[4777]: E0216 21:44:43.742530 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da461874-eb45-43eb-8878-1647b74378c3" containerName="controller-manager"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.742542 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="da461874-eb45-43eb-8878-1647b74378c3" containerName="controller-manager"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.742628 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="da461874-eb45-43eb-8878-1647b74378c3" containerName="controller-manager"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.743039 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.745129 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.746343 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.746546 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.747638 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.748702 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.750286 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.754055 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.778293 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"]
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.787190 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx9w9\" (UniqueName: \"kubernetes.io/projected/aaccba9a-9b68-4326-93c2-00f491eba818-kube-api-access-gx9w9\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.787252 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aaccba9a-9b68-4326-93c2-00f491eba818-client-ca\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.787281 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaccba9a-9b68-4326-93c2-00f491eba818-config\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.787536 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaccba9a-9b68-4326-93c2-00f491eba818-serving-cert\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.787641 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aaccba9a-9b68-4326-93c2-00f491eba818-proxy-ca-bundles\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.889254 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx9w9\" (UniqueName: \"kubernetes.io/projected/aaccba9a-9b68-4326-93c2-00f491eba818-kube-api-access-gx9w9\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.889317 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aaccba9a-9b68-4326-93c2-00f491eba818-client-ca\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.889351 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaccba9a-9b68-4326-93c2-00f491eba818-config\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.889409 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaccba9a-9b68-4326-93c2-00f491eba818-serving-cert\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.889454 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aaccba9a-9b68-4326-93c2-00f491eba818-proxy-ca-bundles\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.890567 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aaccba9a-9b68-4326-93c2-00f491eba818-client-ca\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.891013 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aaccba9a-9b68-4326-93c2-00f491eba818-proxy-ca-bundles\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.891037 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaccba9a-9b68-4326-93c2-00f491eba818-config\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.898563 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaccba9a-9b68-4326-93c2-00f491eba818-serving-cert\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:43 crc kubenswrapper[4777]: I0216 21:44:43.909773 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx9w9\" (UniqueName: \"kubernetes.io/projected/aaccba9a-9b68-4326-93c2-00f491eba818-kube-api-access-gx9w9\") pod \"controller-manager-5877bb7bbb-nc5t7\" (UID: \"aaccba9a-9b68-4326-93c2-00f491eba818\") " pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:44 crc kubenswrapper[4777]: I0216 21:44:44.058983 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:44 crc kubenswrapper[4777]: I0216 21:44:44.188668 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da461874-eb45-43eb-8878-1647b74378c3" path="/var/lib/kubelet/pods/da461874-eb45-43eb-8878-1647b74378c3/volumes"
Feb 16 21:44:44 crc kubenswrapper[4777]: I0216 21:44:44.291560 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"]
Feb 16 21:44:44 crc kubenswrapper[4777]: W0216 21:44:44.303115 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaccba9a_9b68_4326_93c2_00f491eba818.slice/crio-1b2a4b7086bfabb40eae2703670ded6bbc65729e2cd371fa2cf80c058fe5336a WatchSource:0}: Error finding container 1b2a4b7086bfabb40eae2703670ded6bbc65729e2cd371fa2cf80c058fe5336a: Status 404 returned error can't find the container with id 1b2a4b7086bfabb40eae2703670ded6bbc65729e2cd371fa2cf80c058fe5336a
Feb 16 21:44:45 crc kubenswrapper[4777]: I0216 21:44:45.191795 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7" event={"ID":"aaccba9a-9b68-4326-93c2-00f491eba818","Type":"ContainerStarted","Data":"91a02e1b4ae3544365617cfc686d3aba5743321bf4d41da701d6dc1ae52df71f"}
Feb 16 21:44:45 crc kubenswrapper[4777]: I0216 21:44:45.192193 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:45 crc kubenswrapper[4777]: I0216 21:44:45.192215 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7" event={"ID":"aaccba9a-9b68-4326-93c2-00f491eba818","Type":"ContainerStarted","Data":"1b2a4b7086bfabb40eae2703670ded6bbc65729e2cd371fa2cf80c058fe5336a"}
Feb 16 21:44:45 crc kubenswrapper[4777]: I0216 21:44:45.196539 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7"
Feb 16 21:44:45 crc kubenswrapper[4777]: I0216 21:44:45.217220 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5877bb7bbb-nc5t7" podStartSLOduration=3.21720131 podStartE2EDuration="3.21720131s" podCreationTimestamp="2026-02-16 21:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:44:45.215670012 +0000 UTC m=+405.798171124" watchObservedRunningTime="2026-02-16 21:44:45.21720131 +0000 UTC m=+405.799702412"
Feb 16 21:44:55 crc kubenswrapper[4777]: I0216 21:44:55.181419 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-62hsx"
Feb 16 21:44:55 crc kubenswrapper[4777]: I0216 21:44:55.297021 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gdkjm"]
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.209690 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"]
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.211221 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.214061 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.214193 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.218118 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"]
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.332517 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c81e09cd-794a-4752-afa0-f151859cfdd6-config-volume\") pod \"collect-profiles-29521305-cwvkt\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.332797 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c81e09cd-794a-4752-afa0-f151859cfdd6-secret-volume\") pod \"collect-profiles-29521305-cwvkt\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.332866 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wphf6\" (UniqueName: \"kubernetes.io/projected/c81e09cd-794a-4752-afa0-f151859cfdd6-kube-api-access-wphf6\") pod \"collect-profiles-29521305-cwvkt\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.433793 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c81e09cd-794a-4752-afa0-f151859cfdd6-secret-volume\") pod \"collect-profiles-29521305-cwvkt\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.433876 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wphf6\" (UniqueName: \"kubernetes.io/projected/c81e09cd-794a-4752-afa0-f151859cfdd6-kube-api-access-wphf6\") pod \"collect-profiles-29521305-cwvkt\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.433965 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c81e09cd-794a-4752-afa0-f151859cfdd6-config-volume\") pod \"collect-profiles-29521305-cwvkt\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.436285 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c81e09cd-794a-4752-afa0-f151859cfdd6-config-volume\") pod \"collect-profiles-29521305-cwvkt\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.444559 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c81e09cd-794a-4752-afa0-f151859cfdd6-secret-volume\") pod \"collect-profiles-29521305-cwvkt\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.457212 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wphf6\" (UniqueName: \"kubernetes.io/projected/c81e09cd-794a-4752-afa0-f151859cfdd6-kube-api-access-wphf6\") pod \"collect-profiles-29521305-cwvkt\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.537423 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:00 crc kubenswrapper[4777]: I0216 21:45:00.952481 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"]
Feb 16 21:45:01 crc kubenswrapper[4777]: I0216 21:45:01.303158 4777 generic.go:334] "Generic (PLEG): container finished" podID="c81e09cd-794a-4752-afa0-f151859cfdd6" containerID="b3ec78363415f9ff565ab426256289768308a677b858101872808bbd73b8c28a" exitCode=0
Feb 16 21:45:01 crc kubenswrapper[4777]: I0216 21:45:01.303271 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt" event={"ID":"c81e09cd-794a-4752-afa0-f151859cfdd6","Type":"ContainerDied","Data":"b3ec78363415f9ff565ab426256289768308a677b858101872808bbd73b8c28a"}
Feb 16 21:45:01 crc kubenswrapper[4777]: I0216 21:45:01.303355 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt" event={"ID":"c81e09cd-794a-4752-afa0-f151859cfdd6","Type":"ContainerStarted","Data":"150ec54b574a50f99f94405717879c46a348099eb28ae81c3820149e57a8b47c"}
Feb 16 21:45:02 crc kubenswrapper[4777]: I0216 21:45:02.703800 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:02 crc kubenswrapper[4777]: I0216 21:45:02.871616 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wphf6\" (UniqueName: \"kubernetes.io/projected/c81e09cd-794a-4752-afa0-f151859cfdd6-kube-api-access-wphf6\") pod \"c81e09cd-794a-4752-afa0-f151859cfdd6\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") "
Feb 16 21:45:02 crc kubenswrapper[4777]: I0216 21:45:02.871706 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c81e09cd-794a-4752-afa0-f151859cfdd6-secret-volume\") pod \"c81e09cd-794a-4752-afa0-f151859cfdd6\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") "
Feb 16 21:45:02 crc kubenswrapper[4777]: I0216 21:45:02.871855 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c81e09cd-794a-4752-afa0-f151859cfdd6-config-volume\") pod \"c81e09cd-794a-4752-afa0-f151859cfdd6\" (UID: \"c81e09cd-794a-4752-afa0-f151859cfdd6\") "
Feb 16 21:45:02 crc kubenswrapper[4777]: I0216 21:45:02.873081 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81e09cd-794a-4752-afa0-f151859cfdd6-config-volume" (OuterVolumeSpecName: "config-volume") pod "c81e09cd-794a-4752-afa0-f151859cfdd6" (UID: "c81e09cd-794a-4752-afa0-f151859cfdd6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:45:02 crc kubenswrapper[4777]: I0216 21:45:02.878911 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81e09cd-794a-4752-afa0-f151859cfdd6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c81e09cd-794a-4752-afa0-f151859cfdd6" (UID: "c81e09cd-794a-4752-afa0-f151859cfdd6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:45:02 crc kubenswrapper[4777]: I0216 21:45:02.879966 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81e09cd-794a-4752-afa0-f151859cfdd6-kube-api-access-wphf6" (OuterVolumeSpecName: "kube-api-access-wphf6") pod "c81e09cd-794a-4752-afa0-f151859cfdd6" (UID: "c81e09cd-794a-4752-afa0-f151859cfdd6"). InnerVolumeSpecName "kube-api-access-wphf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:45:02 crc kubenswrapper[4777]: I0216 21:45:02.973207 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wphf6\" (UniqueName: \"kubernetes.io/projected/c81e09cd-794a-4752-afa0-f151859cfdd6-kube-api-access-wphf6\") on node \"crc\" DevicePath \"\""
Feb 16 21:45:02 crc kubenswrapper[4777]: I0216 21:45:02.973275 4777 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c81e09cd-794a-4752-afa0-f151859cfdd6-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 16 21:45:02 crc kubenswrapper[4777]: I0216 21:45:02.973452 4777 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c81e09cd-794a-4752-afa0-f151859cfdd6-config-volume\") on node \"crc\" DevicePath \"\""
Feb 16 21:45:03 crc kubenswrapper[4777]: I0216 21:45:03.319815 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt" event={"ID":"c81e09cd-794a-4752-afa0-f151859cfdd6","Type":"ContainerDied","Data":"150ec54b574a50f99f94405717879c46a348099eb28ae81c3820149e57a8b47c"}
Feb 16 21:45:03 crc kubenswrapper[4777]: I0216 21:45:03.319893 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="150ec54b574a50f99f94405717879c46a348099eb28ae81c3820149e57a8b47c"
Feb 16 21:45:03 crc kubenswrapper[4777]: I0216 21:45:03.319895 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.359299 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" podUID="aaaf3fb4-0bfd-4a29-aebe-d364790f620b" containerName="registry" containerID="cri-o://d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795" gracePeriod=30
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.801043 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.960834 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-certificates\") pod \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") "
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.960938 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-installation-pull-secrets\") pod \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") "
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.961034 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-ca-trust-extracted\") pod \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") "
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.961076 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-bound-sa-token\") pod \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") "
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.961118 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-trusted-ca\") pod \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") "
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.961180 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-tls\") pod \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") "
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.961422 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") "
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.961487 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rrnd\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-kube-api-access-6rrnd\") pod \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\" (UID: \"aaaf3fb4-0bfd-4a29-aebe-d364790f620b\") "
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.962870 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "aaaf3fb4-0bfd-4a29-aebe-d364790f620b" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.963448 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "aaaf3fb4-0bfd-4a29-aebe-d364790f620b" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.970127 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "aaaf3fb4-0bfd-4a29-aebe-d364790f620b" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.970185 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "aaaf3fb4-0bfd-4a29-aebe-d364790f620b" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.971564 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "aaaf3fb4-0bfd-4a29-aebe-d364790f620b" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.975075 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-kube-api-access-6rrnd" (OuterVolumeSpecName: "kube-api-access-6rrnd") pod "aaaf3fb4-0bfd-4a29-aebe-d364790f620b" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b"). InnerVolumeSpecName "kube-api-access-6rrnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.976816 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "aaaf3fb4-0bfd-4a29-aebe-d364790f620b" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 16 21:45:20 crc kubenswrapper[4777]: I0216 21:45:20.997134 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "aaaf3fb4-0bfd-4a29-aebe-d364790f620b" (UID: "aaaf3fb4-0bfd-4a29-aebe-d364790f620b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.063040 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rrnd\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-kube-api-access-6rrnd\") on node \"crc\" DevicePath \"\""
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.063100 4777 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.063119 4777 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.063138 4777 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.063156 4777 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.063178 4777 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.063195 4777 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aaaf3fb4-0bfd-4a29-aebe-d364790f620b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.448055 4777 generic.go:334] "Generic (PLEG): container finished" podID="aaaf3fb4-0bfd-4a29-aebe-d364790f620b" containerID="d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795" exitCode=0
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.448121 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" event={"ID":"aaaf3fb4-0bfd-4a29-aebe-d364790f620b","Type":"ContainerDied","Data":"d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795"}
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.448171 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm" event={"ID":"aaaf3fb4-0bfd-4a29-aebe-d364790f620b","Type":"ContainerDied","Data":"7d98676c47149ba10df8d75b1df5ab30249cd49e4125bb00928fbd4c32195eba"}
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.448170 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gdkjm"
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.448220 4777 scope.go:117] "RemoveContainer" containerID="d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795"
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.479282 4777 scope.go:117] "RemoveContainer" containerID="d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795"
Feb 16 21:45:21 crc kubenswrapper[4777]: E0216 21:45:21.480128 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795\": container with ID starting with d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795 not found: ID does not exist" containerID="d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795"
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.480214 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795"} err="failed to get container status \"d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795\": rpc error: code = NotFound desc = could not find container \"d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795\": container with ID starting with d57e9c07c2b6a178927e3b60128cefefdbfc7b2b30ec09a2b18a4524dd103795 not found: ID does not exist"
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.498116 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gdkjm"]
Feb 16 21:45:21 crc kubenswrapper[4777]: I0216 21:45:21.503960 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gdkjm"]
Feb 16 21:45:22 crc kubenswrapper[4777]: I0216 21:45:22.193812 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaaf3fb4-0bfd-4a29-aebe-d364790f620b" path="/var/lib/kubelet/pods/aaaf3fb4-0bfd-4a29-aebe-d364790f620b/volumes"
Feb 16 21:46:41 crc kubenswrapper[4777]: I0216 21:46:41.652425 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 21:46:41 crc kubenswrapper[4777]: I0216 21:46:41.653158 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 21:47:11 crc kubenswrapper[4777]: I0216 21:47:11.651936 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 21:47:11 crc kubenswrapper[4777]: I0216 21:47:11.653555 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 21:47:41 crc kubenswrapper[4777]: I0216 21:47:41.652140 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 21:47:41 crc kubenswrapper[4777]: I0216 21:47:41.653074 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 21:47:41 crc kubenswrapper[4777]: I0216 21:47:41.653169 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj"
Feb 16 21:47:41 crc kubenswrapper[4777]: I0216 21:47:41.654118 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"892b4a604d19cb7c92fac9c2a3fb4800ccf0e20737f719307eb1e9bf24c106b7"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 16 21:47:41 crc kubenswrapper[4777]: I0216 21:47:41.654258 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://892b4a604d19cb7c92fac9c2a3fb4800ccf0e20737f719307eb1e9bf24c106b7" gracePeriod=600
Feb 16 21:47:42 crc kubenswrapper[4777]: I0216 21:47:42.442414 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="892b4a604d19cb7c92fac9c2a3fb4800ccf0e20737f719307eb1e9bf24c106b7" exitCode=0
Feb 16 21:47:42 crc kubenswrapper[4777]: I0216 21:47:42.442507 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"892b4a604d19cb7c92fac9c2a3fb4800ccf0e20737f719307eb1e9bf24c106b7"}
Feb 16 21:47:42 crc kubenswrapper[4777]: I0216 21:47:42.443874 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"a39dbf7f579cf49a749edb90301d2c78d17aa9cbdd7968d6c7cf0fe290e75857"} Feb 16 21:47:42 crc kubenswrapper[4777]: I0216 21:47:42.443936 4777 scope.go:117] "RemoveContainer" containerID="7667578f694ea6d5a67542ec47257b4fa8954f0c5a3bee2ecf97f6ec3a597dc9" Feb 16 21:49:00 crc kubenswrapper[4777]: I0216 21:49:00.481179 4777 scope.go:117] "RemoveContainer" containerID="f3e389f78403ebe8b0abb95cf04041926a80fc1e389a87164f235f7e510a758d" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.779835 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw"] Feb 16 21:49:25 crc kubenswrapper[4777]: E0216 21:49:25.780920 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81e09cd-794a-4752-afa0-f151859cfdd6" containerName="collect-profiles" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.780946 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81e09cd-794a-4752-afa0-f151859cfdd6" containerName="collect-profiles" Feb 16 21:49:25 crc kubenswrapper[4777]: E0216 21:49:25.780964 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaaf3fb4-0bfd-4a29-aebe-d364790f620b" containerName="registry" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.780977 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaaf3fb4-0bfd-4a29-aebe-d364790f620b" containerName="registry" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.781167 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaaf3fb4-0bfd-4a29-aebe-d364790f620b" containerName="registry" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.781194 4777 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="c81e09cd-794a-4752-afa0-f151859cfdd6" containerName="collect-profiles" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.783121 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.786386 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.794693 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw"] Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.890124 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw\" (UID: \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.890211 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw\" (UID: \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.890445 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb49z\" (UniqueName: \"kubernetes.io/projected/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-kube-api-access-qb49z\") pod 
\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw\" (UID: \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.992977 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw\" (UID: \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.993087 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb49z\" (UniqueName: \"kubernetes.io/projected/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-kube-api-access-qb49z\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw\" (UID: \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.993167 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw\" (UID: \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.993950 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw\" (UID: \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:25 crc kubenswrapper[4777]: I0216 21:49:25.994041 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw\" (UID: \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:26 crc kubenswrapper[4777]: I0216 21:49:26.032272 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb49z\" (UniqueName: \"kubernetes.io/projected/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-kube-api-access-qb49z\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw\" (UID: \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:26 crc kubenswrapper[4777]: I0216 21:49:26.102977 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:26 crc kubenswrapper[4777]: I0216 21:49:26.417707 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw"] Feb 16 21:49:27 crc kubenswrapper[4777]: I0216 21:49:27.190946 4777 generic.go:334] "Generic (PLEG): container finished" podID="2bf34f71-118c-4f2c-9e6c-2a172dd903c4" containerID="836e43226f593cfd9869cb68e0d99b0189790c041e48889ed14bce9eb07c7363" exitCode=0 Feb 16 21:49:27 crc kubenswrapper[4777]: I0216 21:49:27.191052 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" event={"ID":"2bf34f71-118c-4f2c-9e6c-2a172dd903c4","Type":"ContainerDied","Data":"836e43226f593cfd9869cb68e0d99b0189790c041e48889ed14bce9eb07c7363"} Feb 16 21:49:27 crc kubenswrapper[4777]: I0216 21:49:27.191339 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" event={"ID":"2bf34f71-118c-4f2c-9e6c-2a172dd903c4","Type":"ContainerStarted","Data":"0bc984d456d8cdd8e6a62355d11c52bb56496dbf5a8e7c0a750c1ac9996eabe3"} Feb 16 21:49:27 crc kubenswrapper[4777]: I0216 21:49:27.193927 4777 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 16 21:49:29 crc kubenswrapper[4777]: I0216 21:49:29.211384 4777 generic.go:334] "Generic (PLEG): container finished" podID="2bf34f71-118c-4f2c-9e6c-2a172dd903c4" containerID="58b4bfb7e89c19baef3635d494db6151ebfa2e15955afb8bd82e778a73093bee" exitCode=0 Feb 16 21:49:29 crc kubenswrapper[4777]: I0216 21:49:29.211859 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" 
event={"ID":"2bf34f71-118c-4f2c-9e6c-2a172dd903c4","Type":"ContainerDied","Data":"58b4bfb7e89c19baef3635d494db6151ebfa2e15955afb8bd82e778a73093bee"} Feb 16 21:49:30 crc kubenswrapper[4777]: I0216 21:49:30.221757 4777 generic.go:334] "Generic (PLEG): container finished" podID="2bf34f71-118c-4f2c-9e6c-2a172dd903c4" containerID="fe70ada293a123edce7043bf977d2c41ed934b28dc42963495c5a2ca61f22d53" exitCode=0 Feb 16 21:49:30 crc kubenswrapper[4777]: I0216 21:49:30.221886 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" event={"ID":"2bf34f71-118c-4f2c-9e6c-2a172dd903c4","Type":"ContainerDied","Data":"fe70ada293a123edce7043bf977d2c41ed934b28dc42963495c5a2ca61f22d53"} Feb 16 21:49:31 crc kubenswrapper[4777]: I0216 21:49:31.602161 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:31 crc kubenswrapper[4777]: I0216 21:49:31.698195 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb49z\" (UniqueName: \"kubernetes.io/projected/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-kube-api-access-qb49z\") pod \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\" (UID: \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " Feb 16 21:49:31 crc kubenswrapper[4777]: I0216 21:49:31.698315 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-util\") pod \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\" (UID: \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " Feb 16 21:49:31 crc kubenswrapper[4777]: I0216 21:49:31.698343 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-bundle\") pod \"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\" (UID: 
\"2bf34f71-118c-4f2c-9e6c-2a172dd903c4\") " Feb 16 21:49:31 crc kubenswrapper[4777]: I0216 21:49:31.701103 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-bundle" (OuterVolumeSpecName: "bundle") pod "2bf34f71-118c-4f2c-9e6c-2a172dd903c4" (UID: "2bf34f71-118c-4f2c-9e6c-2a172dd903c4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:49:31 crc kubenswrapper[4777]: I0216 21:49:31.705253 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-kube-api-access-qb49z" (OuterVolumeSpecName: "kube-api-access-qb49z") pod "2bf34f71-118c-4f2c-9e6c-2a172dd903c4" (UID: "2bf34f71-118c-4f2c-9e6c-2a172dd903c4"). InnerVolumeSpecName "kube-api-access-qb49z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:49:31 crc kubenswrapper[4777]: I0216 21:49:31.800661 4777 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:31 crc kubenswrapper[4777]: I0216 21:49:31.800760 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb49z\" (UniqueName: \"kubernetes.io/projected/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-kube-api-access-qb49z\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:31 crc kubenswrapper[4777]: I0216 21:49:31.968815 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-util" (OuterVolumeSpecName: "util") pod "2bf34f71-118c-4f2c-9e6c-2a172dd903c4" (UID: "2bf34f71-118c-4f2c-9e6c-2a172dd903c4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:49:32 crc kubenswrapper[4777]: I0216 21:49:32.002522 4777 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bf34f71-118c-4f2c-9e6c-2a172dd903c4-util\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:32 crc kubenswrapper[4777]: I0216 21:49:32.253027 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" event={"ID":"2bf34f71-118c-4f2c-9e6c-2a172dd903c4","Type":"ContainerDied","Data":"0bc984d456d8cdd8e6a62355d11c52bb56496dbf5a8e7c0a750c1ac9996eabe3"} Feb 16 21:49:32 crc kubenswrapper[4777]: I0216 21:49:32.253107 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc984d456d8cdd8e6a62355d11c52bb56496dbf5a8e7c0a750c1ac9996eabe3" Feb 16 21:49:32 crc kubenswrapper[4777]: I0216 21:49:32.253216 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.248798 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7nx5h"] Feb 16 21:49:39 crc kubenswrapper[4777]: E0216 21:49:39.249598 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf34f71-118c-4f2c-9e6c-2a172dd903c4" containerName="pull" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.249612 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf34f71-118c-4f2c-9e6c-2a172dd903c4" containerName="pull" Feb 16 21:49:39 crc kubenswrapper[4777]: E0216 21:49:39.249627 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf34f71-118c-4f2c-9e6c-2a172dd903c4" containerName="util" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.249633 4777 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2bf34f71-118c-4f2c-9e6c-2a172dd903c4" containerName="util" Feb 16 21:49:39 crc kubenswrapper[4777]: E0216 21:49:39.249645 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf34f71-118c-4f2c-9e6c-2a172dd903c4" containerName="extract" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.249652 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf34f71-118c-4f2c-9e6c-2a172dd903c4" containerName="extract" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.249757 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bf34f71-118c-4f2c-9e6c-2a172dd903c4" containerName="extract" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.250159 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nx5h" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.253555 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.253796 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-r2rzw" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.258092 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.263858 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7nx5h"] Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.292409 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b"] Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.293089 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.295255 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-b5l4s" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.295661 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.309696 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc"] Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.310676 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.313197 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b"] Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.330225 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc"] Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.344439 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7vbj\" (UniqueName: \"kubernetes.io/projected/48b27685-17d6-45ee-a71a-965bac61e90c-kube-api-access-t7vbj\") pod \"obo-prometheus-operator-68bc856cb9-7nx5h\" (UID: \"48b27685-17d6-45ee-a71a-965bac61e90c\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nx5h" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.344509 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/c2deed2d-e90d-4062-817f-524d5413831d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b\" (UID: \"c2deed2d-e90d-4062-817f-524d5413831d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.344600 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2deed2d-e90d-4062-817f-524d5413831d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b\" (UID: \"c2deed2d-e90d-4062-817f-524d5413831d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.344689 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f5e98cb-6704-49a9-93b5-4158dbddbb58-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc\" (UID: \"5f5e98cb-6704-49a9-93b5-4158dbddbb58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.344743 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f5e98cb-6704-49a9-93b5-4158dbddbb58-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc\" (UID: \"5f5e98cb-6704-49a9-93b5-4158dbddbb58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.447037 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2deed2d-e90d-4062-817f-524d5413831d-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b\" (UID: \"c2deed2d-e90d-4062-817f-524d5413831d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.448202 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2deed2d-e90d-4062-817f-524d5413831d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b\" (UID: \"c2deed2d-e90d-4062-817f-524d5413831d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.448263 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f5e98cb-6704-49a9-93b5-4158dbddbb58-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc\" (UID: \"5f5e98cb-6704-49a9-93b5-4158dbddbb58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.448906 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f5e98cb-6704-49a9-93b5-4158dbddbb58-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc\" (UID: \"5f5e98cb-6704-49a9-93b5-4158dbddbb58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.448977 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7vbj\" (UniqueName: \"kubernetes.io/projected/48b27685-17d6-45ee-a71a-965bac61e90c-kube-api-access-t7vbj\") pod \"obo-prometheus-operator-68bc856cb9-7nx5h\" (UID: \"48b27685-17d6-45ee-a71a-965bac61e90c\") " 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nx5h" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.453827 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5f5e98cb-6704-49a9-93b5-4158dbddbb58-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc\" (UID: \"5f5e98cb-6704-49a9-93b5-4158dbddbb58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.461497 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2deed2d-e90d-4062-817f-524d5413831d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b\" (UID: \"c2deed2d-e90d-4062-817f-524d5413831d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.462387 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2deed2d-e90d-4062-817f-524d5413831d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b\" (UID: \"c2deed2d-e90d-4062-817f-524d5413831d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.477614 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5f5e98cb-6704-49a9-93b5-4158dbddbb58-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc\" (UID: \"5f5e98cb-6704-49a9-93b5-4158dbddbb58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.481150 4777 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/observability-operator-59bdc8b94-4ljm7"] Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.481841 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.486230 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.486453 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-lsw26" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.494871 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-4ljm7"] Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.495727 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7vbj\" (UniqueName: \"kubernetes.io/projected/48b27685-17d6-45ee-a71a-965bac61e90c-kube-api-access-t7vbj\") pod \"obo-prometheus-operator-68bc856cb9-7nx5h\" (UID: \"48b27685-17d6-45ee-a71a-965bac61e90c\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nx5h" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.549931 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/995fa122-9f83-4b7c-97ce-4e1cfeb76b29-observability-operator-tls\") pod \"observability-operator-59bdc8b94-4ljm7\" (UID: \"995fa122-9f83-4b7c-97ce-4e1cfeb76b29\") " pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.550142 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsd52\" (UniqueName: 
\"kubernetes.io/projected/995fa122-9f83-4b7c-97ce-4e1cfeb76b29-kube-api-access-wsd52\") pod \"observability-operator-59bdc8b94-4ljm7\" (UID: \"995fa122-9f83-4b7c-97ce-4e1cfeb76b29\") " pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.567143 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nx5h" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.608398 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.623910 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.651461 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/995fa122-9f83-4b7c-97ce-4e1cfeb76b29-observability-operator-tls\") pod \"observability-operator-59bdc8b94-4ljm7\" (UID: \"995fa122-9f83-4b7c-97ce-4e1cfeb76b29\") " pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.651518 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsd52\" (UniqueName: \"kubernetes.io/projected/995fa122-9f83-4b7c-97ce-4e1cfeb76b29-kube-api-access-wsd52\") pod \"observability-operator-59bdc8b94-4ljm7\" (UID: \"995fa122-9f83-4b7c-97ce-4e1cfeb76b29\") " pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.657608 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/995fa122-9f83-4b7c-97ce-4e1cfeb76b29-observability-operator-tls\") pod \"observability-operator-59bdc8b94-4ljm7\" (UID: \"995fa122-9f83-4b7c-97ce-4e1cfeb76b29\") " pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.678455 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsd52\" (UniqueName: \"kubernetes.io/projected/995fa122-9f83-4b7c-97ce-4e1cfeb76b29-kube-api-access-wsd52\") pod \"observability-operator-59bdc8b94-4ljm7\" (UID: \"995fa122-9f83-4b7c-97ce-4e1cfeb76b29\") " pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.687366 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-zktjn"] Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.688245 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-zktjn" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.690903 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-bjf6z" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.700403 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-zktjn"] Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.753225 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8rtw\" (UniqueName: \"kubernetes.io/projected/0997c6ee-fff0-48ed-8234-72a4dde5f326-kube-api-access-l8rtw\") pod \"perses-operator-5bf474d74f-zktjn\" (UID: \"0997c6ee-fff0-48ed-8234-72a4dde5f326\") " pod="openshift-operators/perses-operator-5bf474d74f-zktjn" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.753301 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0997c6ee-fff0-48ed-8234-72a4dde5f326-openshift-service-ca\") pod \"perses-operator-5bf474d74f-zktjn\" (UID: \"0997c6ee-fff0-48ed-8234-72a4dde5f326\") " pod="openshift-operators/perses-operator-5bf474d74f-zktjn" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.832610 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.854472 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8rtw\" (UniqueName: \"kubernetes.io/projected/0997c6ee-fff0-48ed-8234-72a4dde5f326-kube-api-access-l8rtw\") pod \"perses-operator-5bf474d74f-zktjn\" (UID: \"0997c6ee-fff0-48ed-8234-72a4dde5f326\") " pod="openshift-operators/perses-operator-5bf474d74f-zktjn" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.854519 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0997c6ee-fff0-48ed-8234-72a4dde5f326-openshift-service-ca\") pod \"perses-operator-5bf474d74f-zktjn\" (UID: \"0997c6ee-fff0-48ed-8234-72a4dde5f326\") " pod="openshift-operators/perses-operator-5bf474d74f-zktjn" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.855319 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0997c6ee-fff0-48ed-8234-72a4dde5f326-openshift-service-ca\") pod \"perses-operator-5bf474d74f-zktjn\" (UID: \"0997c6ee-fff0-48ed-8234-72a4dde5f326\") " pod="openshift-operators/perses-operator-5bf474d74f-zktjn" Feb 16 21:49:39 crc kubenswrapper[4777]: I0216 21:49:39.871986 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8rtw\" (UniqueName: 
\"kubernetes.io/projected/0997c6ee-fff0-48ed-8234-72a4dde5f326-kube-api-access-l8rtw\") pod \"perses-operator-5bf474d74f-zktjn\" (UID: \"0997c6ee-fff0-48ed-8234-72a4dde5f326\") " pod="openshift-operators/perses-operator-5bf474d74f-zktjn" Feb 16 21:49:40 crc kubenswrapper[4777]: I0216 21:49:40.011294 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-zktjn" Feb 16 21:49:40 crc kubenswrapper[4777]: I0216 21:49:40.085242 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7nx5h"] Feb 16 21:49:40 crc kubenswrapper[4777]: I0216 21:49:40.093927 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-4ljm7"] Feb 16 21:49:40 crc kubenswrapper[4777]: W0216 21:49:40.101061 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48b27685_17d6_45ee_a71a_965bac61e90c.slice/crio-15d25e08b43fcac7a0a06a313f9d83afad5bd3c260edc47aa032b8ce94b0581b WatchSource:0}: Error finding container 15d25e08b43fcac7a0a06a313f9d83afad5bd3c260edc47aa032b8ce94b0581b: Status 404 returned error can't find the container with id 15d25e08b43fcac7a0a06a313f9d83afad5bd3c260edc47aa032b8ce94b0581b Feb 16 21:49:40 crc kubenswrapper[4777]: W0216 21:49:40.101423 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod995fa122_9f83_4b7c_97ce_4e1cfeb76b29.slice/crio-dd11e803a5fe3913dbafbc7a3992037cf8f862665cc9954f152f3aaabe647213 WatchSource:0}: Error finding container dd11e803a5fe3913dbafbc7a3992037cf8f862665cc9954f152f3aaabe647213: Status 404 returned error can't find the container with id dd11e803a5fe3913dbafbc7a3992037cf8f862665cc9954f152f3aaabe647213 Feb 16 21:49:40 crc kubenswrapper[4777]: I0216 21:49:40.153282 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b"] Feb 16 21:49:40 crc kubenswrapper[4777]: I0216 21:49:40.166503 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc"] Feb 16 21:49:40 crc kubenswrapper[4777]: I0216 21:49:40.294163 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b" event={"ID":"c2deed2d-e90d-4062-817f-524d5413831d","Type":"ContainerStarted","Data":"e26a78079c6962ee6c70ab9d34d70f77cc855219e355794e1946c81cd6906702"} Feb 16 21:49:40 crc kubenswrapper[4777]: I0216 21:49:40.296089 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" event={"ID":"995fa122-9f83-4b7c-97ce-4e1cfeb76b29","Type":"ContainerStarted","Data":"dd11e803a5fe3913dbafbc7a3992037cf8f862665cc9954f152f3aaabe647213"} Feb 16 21:49:40 crc kubenswrapper[4777]: I0216 21:49:40.297101 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nx5h" event={"ID":"48b27685-17d6-45ee-a71a-965bac61e90c","Type":"ContainerStarted","Data":"15d25e08b43fcac7a0a06a313f9d83afad5bd3c260edc47aa032b8ce94b0581b"} Feb 16 21:49:40 crc kubenswrapper[4777]: I0216 21:49:40.298767 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc" event={"ID":"5f5e98cb-6704-49a9-93b5-4158dbddbb58","Type":"ContainerStarted","Data":"5e71eb6eb3d1f4e07032d74c39cc31e5057f4725f3773e2fa71a77529e8bf3a7"} Feb 16 21:49:40 crc kubenswrapper[4777]: I0216 21:49:40.587000 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-zktjn"] Feb 16 21:49:41 crc kubenswrapper[4777]: I0216 21:49:41.305496 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-5bf474d74f-zktjn" event={"ID":"0997c6ee-fff0-48ed-8234-72a4dde5f326","Type":"ContainerStarted","Data":"b17121c18e94eabebadc0dfc0b63d995778002ff6977b1b37979f3a6a311af85"} Feb 16 21:49:41 crc kubenswrapper[4777]: I0216 21:49:41.656814 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:49:41 crc kubenswrapper[4777]: I0216 21:49:41.656903 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.370967 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b" event={"ID":"c2deed2d-e90d-4062-817f-524d5413831d","Type":"ContainerStarted","Data":"b0ccf01974e258ee4b8113b033c42a2fcdcec0fb3c87935056f159299dfdcdf0"} Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.373176 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" event={"ID":"995fa122-9f83-4b7c-97ce-4e1cfeb76b29","Type":"ContainerStarted","Data":"d1ffec6188b3d4a09eedd349e366112c5b45d172994bbe6e44fe1c9d7de486bd"} Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.373524 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.375772 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-5bf474d74f-zktjn" event={"ID":"0997c6ee-fff0-48ed-8234-72a4dde5f326","Type":"ContainerStarted","Data":"80df35060e7d8a75a63952c5b75e1b0ae250a29bc71cdbd1f2c927387c19014c"} Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.375893 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-zktjn" Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.377886 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nx5h" event={"ID":"48b27685-17d6-45ee-a71a-965bac61e90c","Type":"ContainerStarted","Data":"f9be948db2cf8dff20b66a61dc5357c74d2deca73eb1b93b1730623e081d5df7"} Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.380847 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc" event={"ID":"5f5e98cb-6704-49a9-93b5-4158dbddbb58","Type":"ContainerStarted","Data":"2b3d7b53cc52e5152851833bb657e645c1f9e0acae3b8d5db2302c0d8785623a"} Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.392700 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b" podStartSLOduration=2.093529165 podStartE2EDuration="12.392675714s" podCreationTimestamp="2026-02-16 21:49:39 +0000 UTC" firstStartedPulling="2026-02-16 21:49:40.166638122 +0000 UTC m=+700.749139224" lastFinishedPulling="2026-02-16 21:49:50.465784661 +0000 UTC m=+711.048285773" observedRunningTime="2026-02-16 21:49:51.39003825 +0000 UTC m=+711.972539392" watchObservedRunningTime="2026-02-16 21:49:51.392675714 +0000 UTC m=+711.975176816" Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.407226 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" Feb 16 21:49:51 crc kubenswrapper[4777]: 
I0216 21:49:51.417524 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc" podStartSLOduration=2.155341128 podStartE2EDuration="12.41750018s" podCreationTimestamp="2026-02-16 21:49:39 +0000 UTC" firstStartedPulling="2026-02-16 21:49:40.225644646 +0000 UTC m=+700.808145748" lastFinishedPulling="2026-02-16 21:49:50.487803698 +0000 UTC m=+711.070304800" observedRunningTime="2026-02-16 21:49:51.415523164 +0000 UTC m=+711.998024266" watchObservedRunningTime="2026-02-16 21:49:51.41750018 +0000 UTC m=+712.000001282" Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.442700 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nx5h" podStartSLOduration=2.031424555 podStartE2EDuration="12.442672485s" podCreationTimestamp="2026-02-16 21:49:39 +0000 UTC" firstStartedPulling="2026-02-16 21:49:40.103417441 +0000 UTC m=+700.685918563" lastFinishedPulling="2026-02-16 21:49:50.514665391 +0000 UTC m=+711.097166493" observedRunningTime="2026-02-16 21:49:51.438587721 +0000 UTC m=+712.021088823" watchObservedRunningTime="2026-02-16 21:49:51.442672485 +0000 UTC m=+712.025173597" Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.471014 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-4ljm7" podStartSLOduration=2.060443779 podStartE2EDuration="12.470993619s" podCreationTimestamp="2026-02-16 21:49:39 +0000 UTC" firstStartedPulling="2026-02-16 21:49:40.103766621 +0000 UTC m=+700.686267723" lastFinishedPulling="2026-02-16 21:49:50.514316461 +0000 UTC m=+711.096817563" observedRunningTime="2026-02-16 21:49:51.467870201 +0000 UTC m=+712.050371303" watchObservedRunningTime="2026-02-16 21:49:51.470993619 +0000 UTC m=+712.053494721" Feb 16 21:49:51 crc kubenswrapper[4777]: I0216 21:49:51.487125 4777 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-zktjn" podStartSLOduration=2.573982199 podStartE2EDuration="12.48711118s" podCreationTimestamp="2026-02-16 21:49:39 +0000 UTC" firstStartedPulling="2026-02-16 21:49:40.600418678 +0000 UTC m=+701.182919780" lastFinishedPulling="2026-02-16 21:49:50.513547659 +0000 UTC m=+711.096048761" observedRunningTime="2026-02-16 21:49:51.486241656 +0000 UTC m=+712.068742758" watchObservedRunningTime="2026-02-16 21:49:51.48711118 +0000 UTC m=+712.069612282" Feb 16 21:49:53 crc kubenswrapper[4777]: I0216 21:49:53.733785 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w27qk"] Feb 16 21:49:53 crc kubenswrapper[4777]: I0216 21:49:53.734640 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="nbdb" containerID="cri-o://9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa" gracePeriod=30 Feb 16 21:49:53 crc kubenswrapper[4777]: I0216 21:49:53.734805 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="sbdb" containerID="cri-o://34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817" gracePeriod=30 Feb 16 21:49:53 crc kubenswrapper[4777]: I0216 21:49:53.734769 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b" gracePeriod=30 Feb 16 21:49:53 crc kubenswrapper[4777]: I0216 21:49:53.734867 4777 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="northd" containerID="cri-o://972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65" gracePeriod=30 Feb 16 21:49:53 crc kubenswrapper[4777]: I0216 21:49:53.734912 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="kube-rbac-proxy-node" containerID="cri-o://63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936" gracePeriod=30 Feb 16 21:49:53 crc kubenswrapper[4777]: I0216 21:49:53.735000 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovn-acl-logging" containerID="cri-o://955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d" gracePeriod=30 Feb 16 21:49:53 crc kubenswrapper[4777]: I0216 21:49:53.734608 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovn-controller" containerID="cri-o://6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a" gracePeriod=30 Feb 16 21:49:53 crc kubenswrapper[4777]: I0216 21:49:53.772963 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" containerID="cri-o://9495662231ebaa035e6874d5dcb56c062e94dc2af111acff6f4be7588f548456" gracePeriod=30 Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.404512 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovnkube-controller/3.log" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.409203 
4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovn-acl-logging/0.log" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.409754 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovn-controller/0.log" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410264 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="9495662231ebaa035e6874d5dcb56c062e94dc2af111acff6f4be7588f548456" exitCode=0 Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410296 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817" exitCode=0 Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410309 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa" exitCode=0 Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410320 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65" exitCode=0 Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410329 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b" exitCode=0 Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410342 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936" exitCode=0 Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410350 
4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d" exitCode=143 Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410359 4777 generic.go:334] "Generic (PLEG): container finished" podID="a3c293d7-2d38-4047-a104-7f354aebf216" containerID="6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a" exitCode=143 Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410340 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"9495662231ebaa035e6874d5dcb56c062e94dc2af111acff6f4be7588f548456"} Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410418 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817"} Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410435 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa"} Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410459 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65"} Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410472 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" 
event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b"} Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410483 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936"} Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410495 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d"} Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410506 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a"} Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.410523 4777 scope.go:117] "RemoveContainer" containerID="2f5d6bdfd4b616b01a8433ba65f1407404c034f6b6cf74d2decdc6e2ffc39993" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.412342 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vpf28_71656da7-4f33-419d-aaba-93bf9158f706/kube-multus/2.log" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.413021 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vpf28_71656da7-4f33-419d-aaba-93bf9158f706/kube-multus/1.log" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.413073 4777 generic.go:334] "Generic (PLEG): container finished" podID="71656da7-4f33-419d-aaba-93bf9158f706" containerID="8e174a786a13a393184f0f1500ffa4bb237ec6327058a5eb9c93ed917f3b3579" exitCode=2 Feb 16 21:49:54 crc 
kubenswrapper[4777]: I0216 21:49:54.413116 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vpf28" event={"ID":"71656da7-4f33-419d-aaba-93bf9158f706","Type":"ContainerDied","Data":"8e174a786a13a393184f0f1500ffa4bb237ec6327058a5eb9c93ed917f3b3579"} Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.413669 4777 scope.go:117] "RemoveContainer" containerID="8e174a786a13a393184f0f1500ffa4bb237ec6327058a5eb9c93ed917f3b3579" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.414003 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vpf28_openshift-multus(71656da7-4f33-419d-aaba-93bf9158f706)\"" pod="openshift-multus/multus-vpf28" podUID="71656da7-4f33-419d-aaba-93bf9158f706" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.430415 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovn-acl-logging/0.log" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.430899 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovn-controller/0.log" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.431348 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.435931 4777 scope.go:117] "RemoveContainer" containerID="0b7c71b79ec54c7373971886c4f36d1a4e31d558dbecbb5925db4d29961baeb7" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.477621 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-bin\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.477679 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-kubelet\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.477698 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-netd\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.477762 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-slash\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.477798 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2hhz\" (UniqueName: \"kubernetes.io/projected/a3c293d7-2d38-4047-a104-7f354aebf216-kube-api-access-k2hhz\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: 
\"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.477854 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-etc-openvswitch\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.477888 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-ovn-kubernetes\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.477943 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-openvswitch\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.477975 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-systemd\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478014 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-var-lib-openvswitch\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478036 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-netns\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478059 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-env-overrides\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478082 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-ovn\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478101 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-script-lib\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478129 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3c293d7-2d38-4047-a104-7f354aebf216-ovn-node-metrics-cert\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478151 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-node-log\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc 
kubenswrapper[4777]: I0216 21:49:54.478168 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-systemd-units\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478210 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478246 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-config\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478285 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-log-socket\") pod \"a3c293d7-2d38-4047-a104-7f354aebf216\" (UID: \"a3c293d7-2d38-4047-a104-7f354aebf216\") " Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478463 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478509 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478579 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478603 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478659 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-node-log" (OuterVolumeSpecName: "node-log") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478685 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478664 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-log-socket" (OuterVolumeSpecName: "log-socket") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478737 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478772 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-slash" (OuterVolumeSpecName: "host-slash") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478803 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478834 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478809 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478988 4777 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.479011 4777 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.479026 4777 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.479070 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478503 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.479128 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.479385 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.478406 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.486546 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c293d7-2d38-4047-a104-7f354aebf216-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.486694 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c293d7-2d38-4047-a104-7f354aebf216-kube-api-access-k2hhz" (OuterVolumeSpecName: "kube-api-access-k2hhz") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "kube-api-access-k2hhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.500515 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a3c293d7-2d38-4047-a104-7f354aebf216" (UID: "a3c293d7-2d38-4047-a104-7f354aebf216"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505320 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bzjc4"] Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.505582 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="kubecfg-setup" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505602 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="kubecfg-setup" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.505612 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovn-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505620 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovn-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.505633 4777 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505641 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.505649 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="sbdb" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505655 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="sbdb" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.505662 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505668 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.505679 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505688 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.505700 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="kube-rbac-proxy-ovn-metrics" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505707 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="kube-rbac-proxy-ovn-metrics" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.505729 4777 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="kube-rbac-proxy-node" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505735 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="kube-rbac-proxy-node" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.505744 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="nbdb" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505752 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="nbdb" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.505761 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovn-acl-logging" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505767 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovn-acl-logging" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.505775 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="northd" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505781 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="northd" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505971 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovn-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505982 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.505992 4777 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="nbdb" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.506000 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="sbdb" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.506008 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.506016 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="kube-rbac-proxy-ovn-metrics" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.506025 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="kube-rbac-proxy-node" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.506032 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="northd" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.506043 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.506049 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovn-acl-logging" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.506139 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.506146 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: E0216 21:49:54.506154 4777 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.506160 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.506258 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.506268 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" containerName="ovnkube-controller" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.508600 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.579636 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.579677 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-ovnkube-script-lib\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.579707 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-systemd-units\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.579737 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-slash\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.579810 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-kubelet\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.579857 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-run-ovn\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.579882 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-etc-openvswitch\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.579904 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c5jx\" (UniqueName: 
\"kubernetes.io/projected/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-kube-api-access-6c5jx\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580031 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-run-ovn-kubernetes\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580050 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-env-overrides\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580065 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-node-log\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580081 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-run-netns\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580097 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-var-lib-openvswitch\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580112 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-ovnkube-config\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580133 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-run-systemd\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580150 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-cni-netd\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580165 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-log-socket\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580188 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-cni-bin\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580210 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-run-openvswitch\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580229 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-ovn-node-metrics-cert\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580275 4777 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3c293d7-2d38-4047-a104-7f354aebf216-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580288 4777 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-node-log\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580297 4777 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580349 4777 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580380 4777 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-log-socket\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580393 4777 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580405 4777 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580417 4777 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-slash\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580433 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2hhz\" (UniqueName: \"kubernetes.io/projected/a3c293d7-2d38-4047-a104-7f354aebf216-kube-api-access-k2hhz\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580447 4777 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580460 4777 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580472 4777 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580484 4777 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580495 4777 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580506 4777 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580518 4777 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c293d7-2d38-4047-a104-7f354aebf216-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.580530 4777 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3c293d7-2d38-4047-a104-7f354aebf216-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682210 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-cni-netd\") pod 
\"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682271 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-log-socket\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682315 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-cni-bin\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682344 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-run-openvswitch\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682376 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-ovn-node-metrics-cert\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682410 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bzjc4\" 
(UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682409 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-cni-netd\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682436 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-ovnkube-script-lib\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682432 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-log-socket\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682467 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-systemd-units\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682478 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-cni-bin\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 
21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682525 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-run-openvswitch\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682539 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-slash\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682490 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-slash\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682593 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-kubelet\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682598 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682615 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-run-ovn\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682639 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-etc-openvswitch\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682646 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-systemd-units\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682659 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c5jx\" (UniqueName: \"kubernetes.io/projected/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-kube-api-access-6c5jx\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682684 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-run-ovn-kubernetes\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682691 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-run-ovn\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682730 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-env-overrides\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682795 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-node-log\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682841 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-run-netns\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682869 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-var-lib-openvswitch\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682903 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-ovnkube-config\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.682941 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-run-systemd\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.683038 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-run-netns\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.683117 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-etc-openvswitch\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.683150 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-run-systemd\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.683186 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-var-lib-openvswitch\") pod \"ovnkube-node-bzjc4\" 
(UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.683363 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-run-ovn-kubernetes\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.683415 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-node-log\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.683443 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-host-kubelet\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.683470 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-env-overrides\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.683572 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-ovnkube-script-lib\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 
21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.683795 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-ovnkube-config\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.686908 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-ovn-node-metrics-cert\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.726094 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c5jx\" (UniqueName: \"kubernetes.io/projected/d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609-kube-api-access-6c5jx\") pod \"ovnkube-node-bzjc4\" (UID: \"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609\") " pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: I0216 21:49:54.824670 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:49:54 crc kubenswrapper[4777]: W0216 21:49:54.865980 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd69bb86a_b5ff_4c0e_b0f9_3edbd83e6609.slice/crio-c9f7250abe346f863d15c5b7e695a773d3c4219f5f80b45c7d9e0b154f884251 WatchSource:0}: Error finding container c9f7250abe346f863d15c5b7e695a773d3c4219f5f80b45c7d9e0b154f884251: Status 404 returned error can't find the container with id c9f7250abe346f863d15c5b7e695a773d3c4219f5f80b45c7d9e0b154f884251 Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.420866 4777 generic.go:334] "Generic (PLEG): container finished" podID="d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609" containerID="b6dd64a2369f4e1fc082ec19a78274b613f4cb13acebca43f3deec45edae89d6" exitCode=0 Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.420953 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" event={"ID":"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609","Type":"ContainerDied","Data":"b6dd64a2369f4e1fc082ec19a78274b613f4cb13acebca43f3deec45edae89d6"} Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.421556 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" event={"ID":"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609","Type":"ContainerStarted","Data":"c9f7250abe346f863d15c5b7e695a773d3c4219f5f80b45c7d9e0b154f884251"} Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.426408 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovn-acl-logging/0.log" Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.426890 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w27qk_a3c293d7-2d38-4047-a104-7f354aebf216/ovn-controller/0.log" Feb 16 21:49:55 crc 
kubenswrapper[4777]: I0216 21:49:55.428253 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" event={"ID":"a3c293d7-2d38-4047-a104-7f354aebf216","Type":"ContainerDied","Data":"7c449b5c07ecbf291af601994f4937b3f67315035262dbb45b66b3e7bd00103f"} Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.428299 4777 scope.go:117] "RemoveContainer" containerID="9495662231ebaa035e6874d5dcb56c062e94dc2af111acff6f4be7588f548456" Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.428445 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w27qk" Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.440694 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vpf28_71656da7-4f33-419d-aaba-93bf9158f706/kube-multus/2.log" Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.527042 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w27qk"] Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.534233 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w27qk"] Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.708151 4777 scope.go:117] "RemoveContainer" containerID="34a175678e46c77a7100994c47f2fdc511e602b83c2f4059f1884d909ceaa817" Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.736934 4777 scope.go:117] "RemoveContainer" containerID="9480e74c2a033e4de35bbcdd69883c33fba96db740fe161f7f39bd46b33180aa" Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.771952 4777 scope.go:117] "RemoveContainer" containerID="972136ad128b98cf1879a85cb3b0d978e51724de8993dd18415e30cee78f6f65" Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.808827 4777 scope.go:117] "RemoveContainer" containerID="6f924a4d6c2eab4e191e7dfd65cc59c81275916f296537b258df815e2de7815b" Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 
21:49:55.844589 4777 scope.go:117] "RemoveContainer" containerID="63c5422bf58ae6f302dd40b0a6368f06d25f904e6ec121b3b18f328009cc8936" Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.879420 4777 scope.go:117] "RemoveContainer" containerID="955ed8ec25d20dbb04c888f64f7a020e17847530dcb70b4b8a71262b6878045d" Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.896641 4777 scope.go:117] "RemoveContainer" containerID="6d64316acfd74c831148af12d6ca8b91542090106f3bd8ea709e4e1a1ddc669a" Feb 16 21:49:55 crc kubenswrapper[4777]: I0216 21:49:55.929977 4777 scope.go:117] "RemoveContainer" containerID="03c3e896b9b97611521fdc1a16c8c96db14e26078c91292a204c654b6e6435f3" Feb 16 21:49:56 crc kubenswrapper[4777]: I0216 21:49:56.195939 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c293d7-2d38-4047-a104-7f354aebf216" path="/var/lib/kubelet/pods/a3c293d7-2d38-4047-a104-7f354aebf216/volumes" Feb 16 21:49:56 crc kubenswrapper[4777]: I0216 21:49:56.457993 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" event={"ID":"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609","Type":"ContainerStarted","Data":"207e2304250b3b4098984969838d9fdd7ff6e9f7de889654367e4dff3fe28bb0"} Feb 16 21:49:56 crc kubenswrapper[4777]: I0216 21:49:56.458383 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" event={"ID":"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609","Type":"ContainerStarted","Data":"ce836c19fe970033faea3c9e8be2c5d79c0df80fe50ebdcd51ce84b88a4add01"} Feb 16 21:49:56 crc kubenswrapper[4777]: I0216 21:49:56.458393 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" event={"ID":"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609","Type":"ContainerStarted","Data":"ecf1e183c82ad2cbc1323724351a47f8cc61448c1dceb692c331763ee2b4fa13"} Feb 16 21:49:56 crc kubenswrapper[4777]: I0216 21:49:56.458404 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" event={"ID":"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609","Type":"ContainerStarted","Data":"04e7f251584b579cace0e0ebd4fc377fec32a1392f08a5b67e038dc52ea43d40"} Feb 16 21:49:57 crc kubenswrapper[4777]: I0216 21:49:57.468786 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" event={"ID":"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609","Type":"ContainerStarted","Data":"4269fe143c98a6e1793255ad4de93b0e8227a23f9dd64e428ecae64738c73cc2"} Feb 16 21:49:57 crc kubenswrapper[4777]: I0216 21:49:57.468860 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" event={"ID":"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609","Type":"ContainerStarted","Data":"57f925855e10c0f0e90d5901ac3b3b58a84023e92130a9ff3a46d053f79d27ff"} Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.362706 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn"] Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.364028 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.366355 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.366736 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.369934 4777 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-b6gqw" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.383066 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-kgs5x"] Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.383761 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.385641 4777 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-4wqhv" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.413319 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6q4m2"] Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.414001 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.416707 4777 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-mhs5n" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.456409 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gkng\" (UniqueName: \"kubernetes.io/projected/5f765dd2-4db9-46b8-8914-62e18e339d59-kube-api-access-8gkng\") pod \"cert-manager-cainjector-cf98fcc89-lsvsn\" (UID: \"5f765dd2-4db9-46b8-8914-62e18e339d59\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.456457 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r96p\" (UniqueName: \"kubernetes.io/projected/4591e319-88ea-472b-8adc-1aa253262a37-kube-api-access-9r96p\") pod \"cert-manager-webhook-687f57d79b-6q4m2\" (UID: \"4591e319-88ea-472b-8adc-1aa253262a37\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.456627 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq8xl\" (UniqueName: \"kubernetes.io/projected/658c82ce-4722-4465-a72b-f9d8234c286d-kube-api-access-dq8xl\") pod \"cert-manager-858654f9db-kgs5x\" (UID: \"658c82ce-4722-4465-a72b-f9d8234c286d\") " pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.482829 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" event={"ID":"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609","Type":"ContainerStarted","Data":"685abd69dfcb416b5de997e883bf92d5765f78a0d2a2b845b00303651b48cb54"} Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.558017 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8gkng\" (UniqueName: \"kubernetes.io/projected/5f765dd2-4db9-46b8-8914-62e18e339d59-kube-api-access-8gkng\") pod \"cert-manager-cainjector-cf98fcc89-lsvsn\" (UID: \"5f765dd2-4db9-46b8-8914-62e18e339d59\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.558074 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r96p\" (UniqueName: \"kubernetes.io/projected/4591e319-88ea-472b-8adc-1aa253262a37-kube-api-access-9r96p\") pod \"cert-manager-webhook-687f57d79b-6q4m2\" (UID: \"4591e319-88ea-472b-8adc-1aa253262a37\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.558519 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq8xl\" (UniqueName: \"kubernetes.io/projected/658c82ce-4722-4465-a72b-f9d8234c286d-kube-api-access-dq8xl\") pod \"cert-manager-858654f9db-kgs5x\" (UID: \"658c82ce-4722-4465-a72b-f9d8234c286d\") " pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.578608 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gkng\" (UniqueName: \"kubernetes.io/projected/5f765dd2-4db9-46b8-8914-62e18e339d59-kube-api-access-8gkng\") pod \"cert-manager-cainjector-cf98fcc89-lsvsn\" (UID: \"5f765dd2-4db9-46b8-8914-62e18e339d59\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.579210 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq8xl\" (UniqueName: \"kubernetes.io/projected/658c82ce-4722-4465-a72b-f9d8234c286d-kube-api-access-dq8xl\") pod \"cert-manager-858654f9db-kgs5x\" (UID: \"658c82ce-4722-4465-a72b-f9d8234c286d\") " pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 
21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.581883 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r96p\" (UniqueName: \"kubernetes.io/projected/4591e319-88ea-472b-8adc-1aa253262a37-kube-api-access-9r96p\") pod \"cert-manager-webhook-687f57d79b-6q4m2\" (UID: \"4591e319-88ea-472b-8adc-1aa253262a37\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.683119 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.706584 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:49:59 crc kubenswrapper[4777]: I0216 21:49:59.731078 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.731350 4777 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(8d4df08a6d590d387533e9c98938af80da19dfa928ccf92c1c88f0f86a9a12d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.731463 4777 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(8d4df08a6d590d387533e9c98938af80da19dfa928ccf92c1c88f0f86a9a12d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.731487 4777 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(8d4df08a6d590d387533e9c98938af80da19dfa928ccf92c1c88f0f86a9a12d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.731541 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager(5f765dd2-4db9-46b8-8914-62e18e339d59)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager(5f765dd2-4db9-46b8-8914-62e18e339d59)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(8d4df08a6d590d387533e9c98938af80da19dfa928ccf92c1c88f0f86a9a12d3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" podUID="5f765dd2-4db9-46b8-8914-62e18e339d59" Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.762098 4777 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(ba91694fa9bf02b920e2bdb9f3068f6199bfd9363ccc59c988db9c67d9cce69d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.762416 4777 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(ba91694fa9bf02b920e2bdb9f3068f6199bfd9363ccc59c988db9c67d9cce69d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.762439 4777 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(ba91694fa9bf02b920e2bdb9f3068f6199bfd9363ccc59c988db9c67d9cce69d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.762489 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cert-manager-858654f9db-kgs5x_cert-manager(658c82ce-4722-4465-a72b-f9d8234c286d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cert-manager-858654f9db-kgs5x_cert-manager(658c82ce-4722-4465-a72b-f9d8234c286d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(ba91694fa9bf02b920e2bdb9f3068f6199bfd9363ccc59c988db9c67d9cce69d): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="cert-manager/cert-manager-858654f9db-kgs5x" podUID="658c82ce-4722-4465-a72b-f9d8234c286d" Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.769818 4777 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(f9855ca552c9fbf917f8e37abe15ac6c8daa2280db2056d685f94fc2a6d42898): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.769906 4777 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(f9855ca552c9fbf917f8e37abe15ac6c8daa2280db2056d685f94fc2a6d42898): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.769945 4777 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(f9855ca552c9fbf917f8e37abe15ac6c8daa2280db2056d685f94fc2a6d42898): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:49:59 crc kubenswrapper[4777]: E0216 21:49:59.770660 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cert-manager-webhook-687f57d79b-6q4m2_cert-manager(4591e319-88ea-472b-8adc-1aa253262a37)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cert-manager-webhook-687f57d79b-6q4m2_cert-manager(4591e319-88ea-472b-8adc-1aa253262a37)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(f9855ca552c9fbf917f8e37abe15ac6c8daa2280db2056d685f94fc2a6d42898): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" podUID="4591e319-88ea-472b-8adc-1aa253262a37" Feb 16 21:50:00 crc kubenswrapper[4777]: I0216 21:50:00.014063 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-zktjn" Feb 16 21:50:03 crc kubenswrapper[4777]: I0216 21:50:03.506267 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" event={"ID":"d69bb86a-b5ff-4c0e-b0f9-3edbd83e6609","Type":"ContainerStarted","Data":"3a7eb941b1b52584d4030a0b43cb2ef75e5d4878e452c7c60897aab89b73d0bf"} Feb 16 21:50:03 crc kubenswrapper[4777]: I0216 21:50:03.506894 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:50:03 crc kubenswrapper[4777]: I0216 21:50:03.507000 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:50:03 crc kubenswrapper[4777]: I0216 21:50:03.507096 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:50:03 crc 
kubenswrapper[4777]: I0216 21:50:03.538057 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" podStartSLOduration=9.538035709999999 podStartE2EDuration="9.53803571s" podCreationTimestamp="2026-02-16 21:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:50:03.533792111 +0000 UTC m=+724.116293213" watchObservedRunningTime="2026-02-16 21:50:03.53803571 +0000 UTC m=+724.120536812" Feb 16 21:50:03 crc kubenswrapper[4777]: I0216 21:50:03.559062 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:50:03 crc kubenswrapper[4777]: I0216 21:50:03.568777 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:50:04 crc kubenswrapper[4777]: I0216 21:50:04.210112 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn"] Feb 16 21:50:04 crc kubenswrapper[4777]: I0216 21:50:04.210247 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:50:04 crc kubenswrapper[4777]: I0216 21:50:04.210643 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.246910 4777 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(b9d2df930db93c77b5f23539ec701d95414db70131e6a1dd27d12a08d58087ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.246992 4777 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(b9d2df930db93c77b5f23539ec701d95414db70131e6a1dd27d12a08d58087ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.247020 4777 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(b9d2df930db93c77b5f23539ec701d95414db70131e6a1dd27d12a08d58087ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.247066 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager(5f765dd2-4db9-46b8-8914-62e18e339d59)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager(5f765dd2-4db9-46b8-8914-62e18e339d59)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(b9d2df930db93c77b5f23539ec701d95414db70131e6a1dd27d12a08d58087ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" podUID="5f765dd2-4db9-46b8-8914-62e18e339d59" Feb 16 21:50:04 crc kubenswrapper[4777]: I0216 21:50:04.250316 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-kgs5x"] Feb 16 21:50:04 crc kubenswrapper[4777]: I0216 21:50:04.250472 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:50:04 crc kubenswrapper[4777]: I0216 21:50:04.251070 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:50:04 crc kubenswrapper[4777]: I0216 21:50:04.274528 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6q4m2"] Feb 16 21:50:04 crc kubenswrapper[4777]: I0216 21:50:04.274672 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:04 crc kubenswrapper[4777]: I0216 21:50:04.275095 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.320682 4777 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(e6a864fe71d8d51280f2ffd6da944c3f4013ac4b5634794e5bce46c0b74ed738): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.320773 4777 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(e6a864fe71d8d51280f2ffd6da944c3f4013ac4b5634794e5bce46c0b74ed738): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.320804 4777 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(e6a864fe71d8d51280f2ffd6da944c3f4013ac4b5634794e5bce46c0b74ed738): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.320853 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cert-manager-858654f9db-kgs5x_cert-manager(658c82ce-4722-4465-a72b-f9d8234c286d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cert-manager-858654f9db-kgs5x_cert-manager(658c82ce-4722-4465-a72b-f9d8234c286d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(e6a864fe71d8d51280f2ffd6da944c3f4013ac4b5634794e5bce46c0b74ed738): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="cert-manager/cert-manager-858654f9db-kgs5x" podUID="658c82ce-4722-4465-a72b-f9d8234c286d" Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.344314 4777 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(5cfee2b98fbf9503b2af1960a7791fad44f24d7af882e52566949de9c4efd746): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.344415 4777 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(5cfee2b98fbf9503b2af1960a7791fad44f24d7af882e52566949de9c4efd746): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.344451 4777 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(5cfee2b98fbf9503b2af1960a7791fad44f24d7af882e52566949de9c4efd746): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:04 crc kubenswrapper[4777]: E0216 21:50:04.344537 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cert-manager-webhook-687f57d79b-6q4m2_cert-manager(4591e319-88ea-472b-8adc-1aa253262a37)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cert-manager-webhook-687f57d79b-6q4m2_cert-manager(4591e319-88ea-472b-8adc-1aa253262a37)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(5cfee2b98fbf9503b2af1960a7791fad44f24d7af882e52566949de9c4efd746): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" podUID="4591e319-88ea-472b-8adc-1aa253262a37" Feb 16 21:50:09 crc kubenswrapper[4777]: I0216 21:50:09.181843 4777 scope.go:117] "RemoveContainer" containerID="8e174a786a13a393184f0f1500ffa4bb237ec6327058a5eb9c93ed917f3b3579" Feb 16 21:50:09 crc kubenswrapper[4777]: E0216 21:50:09.182972 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vpf28_openshift-multus(71656da7-4f33-419d-aaba-93bf9158f706)\"" pod="openshift-multus/multus-vpf28" podUID="71656da7-4f33-419d-aaba-93bf9158f706" Feb 16 21:50:11 crc kubenswrapper[4777]: I0216 21:50:11.651633 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:50:11 crc kubenswrapper[4777]: I0216 21:50:11.651725 4777 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:50:17 crc kubenswrapper[4777]: I0216 21:50:17.180944 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:50:17 crc kubenswrapper[4777]: I0216 21:50:17.181022 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:17 crc kubenswrapper[4777]: I0216 21:50:17.182698 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:50:17 crc kubenswrapper[4777]: I0216 21:50:17.187028 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:17 crc kubenswrapper[4777]: E0216 21:50:17.244777 4777 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(c7760f68dee45a645d4413246068ea385233c54beda84696a4bf08cd159b79f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 16 21:50:17 crc kubenswrapper[4777]: E0216 21:50:17.245347 4777 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(c7760f68dee45a645d4413246068ea385233c54beda84696a4bf08cd159b79f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:50:17 crc kubenswrapper[4777]: E0216 21:50:17.245408 4777 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(c7760f68dee45a645d4413246068ea385233c54beda84696a4bf08cd159b79f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:50:17 crc kubenswrapper[4777]: E0216 21:50:17.245525 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager(5f765dd2-4db9-46b8-8914-62e18e339d59)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager(5f765dd2-4db9-46b8-8914-62e18e339d59)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-cainjector-cf98fcc89-lsvsn_cert-manager_5f765dd2-4db9-46b8-8914-62e18e339d59_0(c7760f68dee45a645d4413246068ea385233c54beda84696a4bf08cd159b79f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" podUID="5f765dd2-4db9-46b8-8914-62e18e339d59" Feb 16 21:50:17 crc kubenswrapper[4777]: E0216 21:50:17.253907 4777 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(b9f19fe2f05d433e5c9c081d47a6feb473788471d56984a051b1cb85a5205f5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 16 21:50:17 crc kubenswrapper[4777]: E0216 21:50:17.254004 4777 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(b9f19fe2f05d433e5c9c081d47a6feb473788471d56984a051b1cb85a5205f5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:17 crc kubenswrapper[4777]: E0216 21:50:17.254051 4777 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(b9f19fe2f05d433e5c9c081d47a6feb473788471d56984a051b1cb85a5205f5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:17 crc kubenswrapper[4777]: E0216 21:50:17.254144 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cert-manager-webhook-687f57d79b-6q4m2_cert-manager(4591e319-88ea-472b-8adc-1aa253262a37)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cert-manager-webhook-687f57d79b-6q4m2_cert-manager(4591e319-88ea-472b-8adc-1aa253262a37)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-webhook-687f57d79b-6q4m2_cert-manager_4591e319-88ea-472b-8adc-1aa253262a37_0(b9f19fe2f05d433e5c9c081d47a6feb473788471d56984a051b1cb85a5205f5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" podUID="4591e319-88ea-472b-8adc-1aa253262a37" Feb 16 21:50:18 crc kubenswrapper[4777]: I0216 21:50:18.182063 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:50:18 crc kubenswrapper[4777]: I0216 21:50:18.182576 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:50:18 crc kubenswrapper[4777]: E0216 21:50:18.211654 4777 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(156a85fb3d0615b64472d6af83674f53b143db767f5315687e1f615ab19c1012): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 16 21:50:18 crc kubenswrapper[4777]: E0216 21:50:18.211845 4777 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(156a85fb3d0615b64472d6af83674f53b143db767f5315687e1f615ab19c1012): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:50:18 crc kubenswrapper[4777]: E0216 21:50:18.211939 4777 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(156a85fb3d0615b64472d6af83674f53b143db767f5315687e1f615ab19c1012): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:50:18 crc kubenswrapper[4777]: E0216 21:50:18.212066 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cert-manager-858654f9db-kgs5x_cert-manager(658c82ce-4722-4465-a72b-f9d8234c286d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cert-manager-858654f9db-kgs5x_cert-manager(658c82ce-4722-4465-a72b-f9d8234c286d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cert-manager-858654f9db-kgs5x_cert-manager_658c82ce-4722-4465-a72b-f9d8234c286d_0(156a85fb3d0615b64472d6af83674f53b143db767f5315687e1f615ab19c1012): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="cert-manager/cert-manager-858654f9db-kgs5x" podUID="658c82ce-4722-4465-a72b-f9d8234c286d" Feb 16 21:50:21 crc kubenswrapper[4777]: I0216 21:50:21.181768 4777 scope.go:117] "RemoveContainer" containerID="8e174a786a13a393184f0f1500ffa4bb237ec6327058a5eb9c93ed917f3b3579" Feb 16 21:50:21 crc kubenswrapper[4777]: I0216 21:50:21.663347 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vpf28_71656da7-4f33-419d-aaba-93bf9158f706/kube-multus/2.log" Feb 16 21:50:21 crc kubenswrapper[4777]: I0216 21:50:21.663456 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vpf28" event={"ID":"71656da7-4f33-419d-aaba-93bf9158f706","Type":"ContainerStarted","Data":"5f8ef96a53690c15a353fcb37d5db18f41d277a1d018fecca29dfae1125e6538"} Feb 16 21:50:24 crc kubenswrapper[4777]: I0216 21:50:24.861843 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bzjc4" Feb 16 21:50:30 crc kubenswrapper[4777]: I0216 21:50:30.180833 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:50:30 crc kubenswrapper[4777]: I0216 21:50:30.180987 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:30 crc kubenswrapper[4777]: I0216 21:50:30.188595 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:30 crc kubenswrapper[4777]: I0216 21:50:30.188685 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kgs5x" Feb 16 21:50:30 crc kubenswrapper[4777]: I0216 21:50:30.469905 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-kgs5x"] Feb 16 21:50:30 crc kubenswrapper[4777]: W0216 21:50:30.474773 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod658c82ce_4722_4465_a72b_f9d8234c286d.slice/crio-f7dd5e8637d6162780f86a04085b151ddcd2b9a12fb335cbd10423e31213d894 WatchSource:0}: Error finding container f7dd5e8637d6162780f86a04085b151ddcd2b9a12fb335cbd10423e31213d894: Status 404 returned error can't find the container with id f7dd5e8637d6162780f86a04085b151ddcd2b9a12fb335cbd10423e31213d894 Feb 16 21:50:30 crc kubenswrapper[4777]: I0216 21:50:30.521328 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6q4m2"] Feb 16 21:50:30 crc kubenswrapper[4777]: W0216 21:50:30.522323 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4591e319_88ea_472b_8adc_1aa253262a37.slice/crio-452820099f3bfaf7969644a61f3df64524c201a28c862db6af8a1149dc95bd30 WatchSource:0}: Error finding container 452820099f3bfaf7969644a61f3df64524c201a28c862db6af8a1149dc95bd30: Status 404 returned error can't find the 
container with id 452820099f3bfaf7969644a61f3df64524c201a28c862db6af8a1149dc95bd30 Feb 16 21:50:30 crc kubenswrapper[4777]: I0216 21:50:30.730320 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-kgs5x" event={"ID":"658c82ce-4722-4465-a72b-f9d8234c286d","Type":"ContainerStarted","Data":"f7dd5e8637d6162780f86a04085b151ddcd2b9a12fb335cbd10423e31213d894"} Feb 16 21:50:30 crc kubenswrapper[4777]: I0216 21:50:30.733694 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" event={"ID":"4591e319-88ea-472b-8adc-1aa253262a37","Type":"ContainerStarted","Data":"452820099f3bfaf7969644a61f3df64524c201a28c862db6af8a1149dc95bd30"} Feb 16 21:50:32 crc kubenswrapper[4777]: I0216 21:50:32.182886 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:50:32 crc kubenswrapper[4777]: I0216 21:50:32.183133 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" Feb 16 21:50:32 crc kubenswrapper[4777]: I0216 21:50:32.437347 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn"] Feb 16 21:50:33 crc kubenswrapper[4777]: I0216 21:50:33.756516 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" event={"ID":"5f765dd2-4db9-46b8-8914-62e18e339d59","Type":"ContainerStarted","Data":"1e2fa378b861a9e6528cb253ac5943e6d84c0dda8d90ab239ac3c4f34e32a3f3"} Feb 16 21:50:33 crc kubenswrapper[4777]: I0216 21:50:33.759152 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-kgs5x" event={"ID":"658c82ce-4722-4465-a72b-f9d8234c286d","Type":"ContainerStarted","Data":"99f280b4877d7ccb0b77ad0f54bf0367fff64118b21e08acf1223a5b8aa1ef26"} Feb 16 21:50:33 crc kubenswrapper[4777]: I0216 21:50:33.760840 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" event={"ID":"4591e319-88ea-472b-8adc-1aa253262a37","Type":"ContainerStarted","Data":"2fbbd7dca0c186373222b71e94f0c14bd6bc513930648f778e1da3af88b15eef"} Feb 16 21:50:33 crc kubenswrapper[4777]: I0216 21:50:33.761014 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:33 crc kubenswrapper[4777]: I0216 21:50:33.782205 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-kgs5x" podStartSLOduration=31.700017874 podStartE2EDuration="34.782185928s" podCreationTimestamp="2026-02-16 21:49:59 +0000 UTC" firstStartedPulling="2026-02-16 21:50:30.477881227 +0000 UTC m=+751.060382329" lastFinishedPulling="2026-02-16 21:50:33.560049271 +0000 UTC m=+754.142550383" observedRunningTime="2026-02-16 21:50:33.777019073 +0000 UTC m=+754.359520195" 
watchObservedRunningTime="2026-02-16 21:50:33.782185928 +0000 UTC m=+754.364687030" Feb 16 21:50:34 crc kubenswrapper[4777]: I0216 21:50:34.772891 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" event={"ID":"5f765dd2-4db9-46b8-8914-62e18e339d59","Type":"ContainerStarted","Data":"a81f2389baa2bdc55fa21e9e32f96997dc5a0230b977bad892fa3bae6737278d"} Feb 16 21:50:34 crc kubenswrapper[4777]: I0216 21:50:34.804893 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" podStartSLOduration=32.78330297 podStartE2EDuration="35.80483076s" podCreationTimestamp="2026-02-16 21:49:59 +0000 UTC" firstStartedPulling="2026-02-16 21:50:30.526122275 +0000 UTC m=+751.108623377" lastFinishedPulling="2026-02-16 21:50:33.547650025 +0000 UTC m=+754.130151167" observedRunningTime="2026-02-16 21:50:33.794048319 +0000 UTC m=+754.376549431" watchObservedRunningTime="2026-02-16 21:50:34.80483076 +0000 UTC m=+755.387331902" Feb 16 21:50:34 crc kubenswrapper[4777]: I0216 21:50:34.810167 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-lsvsn" podStartSLOduration=34.281093518 podStartE2EDuration="35.809808339s" podCreationTimestamp="2026-02-16 21:49:59 +0000 UTC" firstStartedPulling="2026-02-16 21:50:32.775536153 +0000 UTC m=+753.358037255" lastFinishedPulling="2026-02-16 21:50:34.304250974 +0000 UTC m=+754.886752076" observedRunningTime="2026-02-16 21:50:34.792866426 +0000 UTC m=+755.375367578" watchObservedRunningTime="2026-02-16 21:50:34.809808339 +0000 UTC m=+755.392309471" Feb 16 21:50:39 crc kubenswrapper[4777]: I0216 21:50:39.735868 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-6q4m2" Feb 16 21:50:41 crc kubenswrapper[4777]: I0216 21:50:41.651640 4777 patch_prober.go:28] interesting 
pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 21:50:41 crc kubenswrapper[4777]: I0216 21:50:41.651785 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 21:50:41 crc kubenswrapper[4777]: I0216 21:50:41.651869 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj"
Feb 16 21:50:41 crc kubenswrapper[4777]: I0216 21:50:41.652811 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a39dbf7f579cf49a749edb90301d2c78d17aa9cbdd7968d6c7cf0fe290e75857"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 16 21:50:41 crc kubenswrapper[4777]: I0216 21:50:41.652932 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://a39dbf7f579cf49a749edb90301d2c78d17aa9cbdd7968d6c7cf0fe290e75857" gracePeriod=600
Feb 16 21:50:41 crc kubenswrapper[4777]: I0216 21:50:41.830960 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="a39dbf7f579cf49a749edb90301d2c78d17aa9cbdd7968d6c7cf0fe290e75857" exitCode=0
Feb 16 21:50:41 crc kubenswrapper[4777]: I0216 21:50:41.831064 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"a39dbf7f579cf49a749edb90301d2c78d17aa9cbdd7968d6c7cf0fe290e75857"}
Feb 16 21:50:41 crc kubenswrapper[4777]: I0216 21:50:41.831684 4777 scope.go:117] "RemoveContainer" containerID="892b4a604d19cb7c92fac9c2a3fb4800ccf0e20737f719307eb1e9bf24c106b7"
Feb 16 21:50:42 crc kubenswrapper[4777]: I0216 21:50:42.843306 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"7bcaca13099ae0d1981555aa9ad645633be960905be53ca0bdd76ed67ab8a1a2"}
Feb 16 21:50:44 crc kubenswrapper[4777]: I0216 21:50:44.643876 4777 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.083871 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"]
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.086384 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.088412 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.097574 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"]
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.132144 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.132203 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.132246 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx5w8\" (UniqueName: \"kubernetes.io/projected/007313eb-feab-4b33-a5f7-d0f6e842d722-kube-api-access-vx5w8\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.233337 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.233410 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.233472 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx5w8\" (UniqueName: \"kubernetes.io/projected/007313eb-feab-4b33-a5f7-d0f6e842d722-kube-api-access-vx5w8\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.234232 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.234305 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.268691 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx5w8\" (UniqueName: \"kubernetes.io/projected/007313eb-feab-4b33-a5f7-d0f6e842d722-kube-api-access-vx5w8\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.445614 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:03 crc kubenswrapper[4777]: I0216 21:51:03.720505 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"]
Feb 16 21:51:04 crc kubenswrapper[4777]: I0216 21:51:04.005414 4777 generic.go:334] "Generic (PLEG): container finished" podID="007313eb-feab-4b33-a5f7-d0f6e842d722" containerID="1fe67f4e3852922c8d9895917933266044c6add2878cf873fe52bb646e6005b9" exitCode=0
Feb 16 21:51:04 crc kubenswrapper[4777]: I0216 21:51:04.005764 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78" event={"ID":"007313eb-feab-4b33-a5f7-d0f6e842d722","Type":"ContainerDied","Data":"1fe67f4e3852922c8d9895917933266044c6add2878cf873fe52bb646e6005b9"}
Feb 16 21:51:04 crc kubenswrapper[4777]: I0216 21:51:04.005860 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78" event={"ID":"007313eb-feab-4b33-a5f7-d0f6e842d722","Type":"ContainerStarted","Data":"59a0098563edc5b8257a933bcbdb5806a22e8f6337be607f00141d93c02b5682"}
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.028306 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"]
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.029776 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.033313 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.033434 4777 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-ww87r"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.033313 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.050198 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"]
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.164507 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mw8\" (UniqueName: \"kubernetes.io/projected/e97b319e-c5d9-4c32-bd4a-a6c72ba763e9-kube-api-access-b4mw8\") pod \"minio\" (UID: \"e97b319e-c5d9-4c32-bd4a-a6c72ba763e9\") " pod="minio-dev/minio"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.165237 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8302756d-ccc6-4706-84b3-6f80a4a0618d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8302756d-ccc6-4706-84b3-6f80a4a0618d\") pod \"minio\" (UID: \"e97b319e-c5d9-4c32-bd4a-a6c72ba763e9\") " pod="minio-dev/minio"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.267030 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8302756d-ccc6-4706-84b3-6f80a4a0618d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8302756d-ccc6-4706-84b3-6f80a4a0618d\") pod \"minio\" (UID: \"e97b319e-c5d9-4c32-bd4a-a6c72ba763e9\") " pod="minio-dev/minio"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.267121 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mw8\" (UniqueName: \"kubernetes.io/projected/e97b319e-c5d9-4c32-bd4a-a6c72ba763e9-kube-api-access-b4mw8\") pod \"minio\" (UID: \"e97b319e-c5d9-4c32-bd4a-a6c72ba763e9\") " pod="minio-dev/minio"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.271861 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.271915 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8302756d-ccc6-4706-84b3-6f80a4a0618d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8302756d-ccc6-4706-84b3-6f80a4a0618d\") pod \"minio\" (UID: \"e97b319e-c5d9-4c32-bd4a-a6c72ba763e9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/685e39039baebec7fd7db50b475d8d0fe067bb8f333a6b1e7169952b1a6676f2/globalmount\"" pod="minio-dev/minio"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.297836 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mw8\" (UniqueName: \"kubernetes.io/projected/e97b319e-c5d9-4c32-bd4a-a6c72ba763e9-kube-api-access-b4mw8\") pod \"minio\" (UID: \"e97b319e-c5d9-4c32-bd4a-a6c72ba763e9\") " pod="minio-dev/minio"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.324995 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8302756d-ccc6-4706-84b3-6f80a4a0618d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8302756d-ccc6-4706-84b3-6f80a4a0618d\") pod \"minio\" (UID: \"e97b319e-c5d9-4c32-bd4a-a6c72ba763e9\") " pod="minio-dev/minio"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.355533 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.442867 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2mg97"]
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.444921 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.451926 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2mg97"]
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.573343 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-utilities\") pod \"redhat-operators-2mg97\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.573462 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqbnk\" (UniqueName: \"kubernetes.io/projected/b7827d0f-4b87-44d4-84b8-48d08cd827df-kube-api-access-hqbnk\") pod \"redhat-operators-2mg97\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.573505 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-catalog-content\") pod \"redhat-operators-2mg97\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.674238 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqbnk\" (UniqueName: \"kubernetes.io/projected/b7827d0f-4b87-44d4-84b8-48d08cd827df-kube-api-access-hqbnk\") pod \"redhat-operators-2mg97\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.674602 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-catalog-content\") pod \"redhat-operators-2mg97\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.674627 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-utilities\") pod \"redhat-operators-2mg97\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.675153 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-utilities\") pod \"redhat-operators-2mg97\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.675423 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-catalog-content\") pod \"redhat-operators-2mg97\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.696615 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqbnk\" (UniqueName: \"kubernetes.io/projected/b7827d0f-4b87-44d4-84b8-48d08cd827df-kube-api-access-hqbnk\") pod \"redhat-operators-2mg97\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.775819 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:05 crc kubenswrapper[4777]: I0216 21:51:05.952241 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"]
Feb 16 21:51:06 crc kubenswrapper[4777]: I0216 21:51:06.007894 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2mg97"]
Feb 16 21:51:06 crc kubenswrapper[4777]: I0216 21:51:06.036577 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"e97b319e-c5d9-4c32-bd4a-a6c72ba763e9","Type":"ContainerStarted","Data":"9014d4b103a95cbd78e7528bcf456c9193beaa22a77d7d7410b39563c2fdb0ff"}
Feb 16 21:51:06 crc kubenswrapper[4777]: I0216 21:51:06.039704 4777 generic.go:334] "Generic (PLEG): container finished" podID="007313eb-feab-4b33-a5f7-d0f6e842d722" containerID="f08410f42706c1af664107c002815041040d5400e8f3be3068b220fad5f04e7d" exitCode=0
Feb 16 21:51:06 crc kubenswrapper[4777]: I0216 21:51:06.039756 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78" event={"ID":"007313eb-feab-4b33-a5f7-d0f6e842d722","Type":"ContainerDied","Data":"f08410f42706c1af664107c002815041040d5400e8f3be3068b220fad5f04e7d"}
Feb 16 21:51:07 crc kubenswrapper[4777]: I0216 21:51:07.053057 4777 generic.go:334] "Generic (PLEG): container finished" podID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerID="5f94d0ba6e5cc760336dd4539c74bb0d25a2e3a2bcc9ce291f0f0e6a2be876a7" exitCode=0
Feb 16 21:51:07 crc kubenswrapper[4777]: I0216 21:51:07.053145 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mg97" event={"ID":"b7827d0f-4b87-44d4-84b8-48d08cd827df","Type":"ContainerDied","Data":"5f94d0ba6e5cc760336dd4539c74bb0d25a2e3a2bcc9ce291f0f0e6a2be876a7"}
Feb 16 21:51:07 crc kubenswrapper[4777]: I0216 21:51:07.053385 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mg97" event={"ID":"b7827d0f-4b87-44d4-84b8-48d08cd827df","Type":"ContainerStarted","Data":"d74cf0a5967faaf3f4df0045d42a85b50009e1c65ff60d110794bdcc892ed9d0"}
Feb 16 21:51:07 crc kubenswrapper[4777]: I0216 21:51:07.062695 4777 generic.go:334] "Generic (PLEG): container finished" podID="007313eb-feab-4b33-a5f7-d0f6e842d722" containerID="4c8827c36c1908d583321966a8c4967c12e17bfc539e6b632898ec4ec666fd5d" exitCode=0
Feb 16 21:51:07 crc kubenswrapper[4777]: I0216 21:51:07.062751 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78" event={"ID":"007313eb-feab-4b33-a5f7-d0f6e842d722","Type":"ContainerDied","Data":"4c8827c36c1908d583321966a8c4967c12e17bfc539e6b632898ec4ec666fd5d"}
Feb 16 21:51:08 crc kubenswrapper[4777]: I0216 21:51:08.825587 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:08 crc kubenswrapper[4777]: I0216 21:51:08.920666 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-util\") pod \"007313eb-feab-4b33-a5f7-d0f6e842d722\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") "
Feb 16 21:51:08 crc kubenswrapper[4777]: I0216 21:51:08.920799 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-bundle\") pod \"007313eb-feab-4b33-a5f7-d0f6e842d722\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") "
Feb 16 21:51:08 crc kubenswrapper[4777]: I0216 21:51:08.920943 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx5w8\" (UniqueName: \"kubernetes.io/projected/007313eb-feab-4b33-a5f7-d0f6e842d722-kube-api-access-vx5w8\") pod \"007313eb-feab-4b33-a5f7-d0f6e842d722\" (UID: \"007313eb-feab-4b33-a5f7-d0f6e842d722\") "
Feb 16 21:51:08 crc kubenswrapper[4777]: I0216 21:51:08.921963 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-bundle" (OuterVolumeSpecName: "bundle") pod "007313eb-feab-4b33-a5f7-d0f6e842d722" (UID: "007313eb-feab-4b33-a5f7-d0f6e842d722"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:51:08 crc kubenswrapper[4777]: I0216 21:51:08.939143 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007313eb-feab-4b33-a5f7-d0f6e842d722-kube-api-access-vx5w8" (OuterVolumeSpecName: "kube-api-access-vx5w8") pod "007313eb-feab-4b33-a5f7-d0f6e842d722" (UID: "007313eb-feab-4b33-a5f7-d0f6e842d722"). InnerVolumeSpecName "kube-api-access-vx5w8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:51:08 crc kubenswrapper[4777]: I0216 21:51:08.939619 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-util" (OuterVolumeSpecName: "util") pod "007313eb-feab-4b33-a5f7-d0f6e842d722" (UID: "007313eb-feab-4b33-a5f7-d0f6e842d722"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:51:09 crc kubenswrapper[4777]: I0216 21:51:09.022775 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx5w8\" (UniqueName: \"kubernetes.io/projected/007313eb-feab-4b33-a5f7-d0f6e842d722-kube-api-access-vx5w8\") on node \"crc\" DevicePath \"\""
Feb 16 21:51:09 crc kubenswrapper[4777]: I0216 21:51:09.022812 4777 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-util\") on node \"crc\" DevicePath \"\""
Feb 16 21:51:09 crc kubenswrapper[4777]: I0216 21:51:09.022823 4777 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/007313eb-feab-4b33-a5f7-d0f6e842d722-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 21:51:09 crc kubenswrapper[4777]: I0216 21:51:09.080062 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78" event={"ID":"007313eb-feab-4b33-a5f7-d0f6e842d722","Type":"ContainerDied","Data":"59a0098563edc5b8257a933bcbdb5806a22e8f6337be607f00141d93c02b5682"}
Feb 16 21:51:09 crc kubenswrapper[4777]: I0216 21:51:09.080112 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59a0098563edc5b8257a933bcbdb5806a22e8f6337be607f00141d93c02b5682"
Feb 16 21:51:09 crc kubenswrapper[4777]: I0216 21:51:09.080170 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78"
Feb 16 21:51:10 crc kubenswrapper[4777]: I0216 21:51:10.089781 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mg97" event={"ID":"b7827d0f-4b87-44d4-84b8-48d08cd827df","Type":"ContainerStarted","Data":"4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4"}
Feb 16 21:51:10 crc kubenswrapper[4777]: I0216 21:51:10.090853 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"e97b319e-c5d9-4c32-bd4a-a6c72ba763e9","Type":"ContainerStarted","Data":"bbbc2c1f2c57ec17629cde3e24cbda872c9aafe5a639246b507899a4d86630d2"}
Feb 16 21:51:10 crc kubenswrapper[4777]: I0216 21:51:10.134805 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.483384867 podStartE2EDuration="8.134780425s" podCreationTimestamp="2026-02-16 21:51:02 +0000 UTC" firstStartedPulling="2026-02-16 21:51:05.967704069 +0000 UTC m=+786.550205171" lastFinishedPulling="2026-02-16 21:51:09.619099597 +0000 UTC m=+790.201600729" observedRunningTime="2026-02-16 21:51:10.131888944 +0000 UTC m=+790.714390056" watchObservedRunningTime="2026-02-16 21:51:10.134780425 +0000 UTC m=+790.717281527"
Feb 16 21:51:11 crc kubenswrapper[4777]: I0216 21:51:11.098017 4777 generic.go:334] "Generic (PLEG): container finished" podID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerID="4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4" exitCode=0
Feb 16 21:51:11 crc kubenswrapper[4777]: I0216 21:51:11.098058 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mg97" event={"ID":"b7827d0f-4b87-44d4-84b8-48d08cd827df","Type":"ContainerDied","Data":"4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4"}
Feb 16 21:51:12 crc kubenswrapper[4777]: I0216 21:51:12.106115 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mg97" event={"ID":"b7827d0f-4b87-44d4-84b8-48d08cd827df","Type":"ContainerStarted","Data":"77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130"}
Feb 16 21:51:12 crc kubenswrapper[4777]: I0216 21:51:12.127632 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2mg97" podStartSLOduration=3.008024955 podStartE2EDuration="7.127611424s" podCreationTimestamp="2026-02-16 21:51:05 +0000 UTC" firstStartedPulling="2026-02-16 21:51:07.383816694 +0000 UTC m=+787.966317796" lastFinishedPulling="2026-02-16 21:51:11.503403143 +0000 UTC m=+792.085904265" observedRunningTime="2026-02-16 21:51:12.123312584 +0000 UTC m=+792.705813716" watchObservedRunningTime="2026-02-16 21:51:12.127611424 +0000 UTC m=+792.710112526"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.482268 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"]
Feb 16 21:51:15 crc kubenswrapper[4777]: E0216 21:51:15.482794 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007313eb-feab-4b33-a5f7-d0f6e842d722" containerName="extract"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.482811 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="007313eb-feab-4b33-a5f7-d0f6e842d722" containerName="extract"
Feb 16 21:51:15 crc kubenswrapper[4777]: E0216 21:51:15.482831 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007313eb-feab-4b33-a5f7-d0f6e842d722" containerName="pull"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.482842 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="007313eb-feab-4b33-a5f7-d0f6e842d722" containerName="pull"
Feb 16 21:51:15 crc kubenswrapper[4777]: E0216 21:51:15.482863 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007313eb-feab-4b33-a5f7-d0f6e842d722" containerName="util"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.482872 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="007313eb-feab-4b33-a5f7-d0f6e842d722" containerName="util"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.483005 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="007313eb-feab-4b33-a5f7-d0f6e842d722" containerName="extract"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.483861 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.486799 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.486831 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.486838 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-wnlfm"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.486884 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.486884 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.487013 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.500369 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"]
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.544387 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-manager-config\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.544697 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.544839 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw8th\" (UniqueName: \"kubernetes.io/projected/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-kube-api-access-xw8th\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.545001 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-webhook-cert\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.545133 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-apiservice-cert\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.646844 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-manager-config\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.646906 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.646931 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw8th\" (UniqueName: \"kubernetes.io/projected/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-kube-api-access-xw8th\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.647775 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-webhook-cert\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.647804 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-apiservice-cert\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.647940 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-manager-config\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.653844 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.656343 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-apiservice-cert\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.656878 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-webhook-cert\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.695486 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw8th\" (UniqueName: \"kubernetes.io/projected/94269fb5-9977-4c2c-aa87-1fe5a841dc8d-kube-api-access-xw8th\") pod \"loki-operator-controller-manager-5b545f4d58-l9tqd\" (UID: \"94269fb5-9977-4c2c-aa87-1fe5a841dc8d\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.775993 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.776469 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:15 crc kubenswrapper[4777]: I0216 21:51:15.799899 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"
Feb 16 21:51:16 crc kubenswrapper[4777]: I0216 21:51:16.150451 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd"]
Feb 16 21:51:16 crc kubenswrapper[4777]: I0216 21:51:16.821929 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2mg97" podUID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerName="registry-server" probeResult="failure" output=<
Feb 16 21:51:16 crc kubenswrapper[4777]: timeout: failed to connect service ":50051" within 1s
Feb 16 21:51:16 crc kubenswrapper[4777]: >
Feb 16 21:51:17 crc kubenswrapper[4777]: I0216 21:51:17.132291 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd" event={"ID":"94269fb5-9977-4c2c-aa87-1fe5a841dc8d","Type":"ContainerStarted","Data":"5525e74c90e1c661fae2dc31ff7381bbfe288dc7b7c8ca491e319a127ac7938b"}
Feb 16 21:51:22 crc kubenswrapper[4777]: I0216 21:51:22.164391 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd" event={"ID":"94269fb5-9977-4c2c-aa87-1fe5a841dc8d","Type":"ContainerStarted","Data":"ae420920eb2fbae96b67d2f415c99097aca28375a3824c4992e0b916c5dadd57"}
Feb 16 21:51:25 crc kubenswrapper[4777]: I0216 21:51:25.863166 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:25 crc kubenswrapper[4777]: I0216 21:51:25.917890 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2mg97"
Feb 16 21:51:26 crc kubenswrapper[4777]: I0216 21:51:26.824747 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2mg97"]
Feb 16 21:51:27 crc
kubenswrapper[4777]: I0216 21:51:27.415262 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd" event={"ID":"94269fb5-9977-4c2c-aa87-1fe5a841dc8d","Type":"ContainerStarted","Data":"ca545888bac91457d7f720f4e12ba98904bb07941b82c81e16dd35c437a18f26"} Feb 16 21:51:27 crc kubenswrapper[4777]: I0216 21:51:27.415418 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2mg97" podUID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerName="registry-server" containerID="cri-o://77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130" gracePeriod=2 Feb 16 21:51:27 crc kubenswrapper[4777]: I0216 21:51:27.442822 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd" podStartSLOduration=1.434510099 podStartE2EDuration="12.442782024s" podCreationTimestamp="2026-02-16 21:51:15 +0000 UTC" firstStartedPulling="2026-02-16 21:51:16.146606775 +0000 UTC m=+796.729107877" lastFinishedPulling="2026-02-16 21:51:27.1548787 +0000 UTC m=+807.737379802" observedRunningTime="2026-02-16 21:51:27.439467021 +0000 UTC m=+808.021968133" watchObservedRunningTime="2026-02-16 21:51:27.442782024 +0000 UTC m=+808.025283126" Feb 16 21:51:27 crc kubenswrapper[4777]: I0216 21:51:27.783266 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2mg97" Feb 16 21:51:27 crc kubenswrapper[4777]: I0216 21:51:27.819561 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-catalog-content\") pod \"b7827d0f-4b87-44d4-84b8-48d08cd827df\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " Feb 16 21:51:27 crc kubenswrapper[4777]: I0216 21:51:27.819631 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-utilities\") pod \"b7827d0f-4b87-44d4-84b8-48d08cd827df\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " Feb 16 21:51:27 crc kubenswrapper[4777]: I0216 21:51:27.819652 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqbnk\" (UniqueName: \"kubernetes.io/projected/b7827d0f-4b87-44d4-84b8-48d08cd827df-kube-api-access-hqbnk\") pod \"b7827d0f-4b87-44d4-84b8-48d08cd827df\" (UID: \"b7827d0f-4b87-44d4-84b8-48d08cd827df\") " Feb 16 21:51:27 crc kubenswrapper[4777]: I0216 21:51:27.821458 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-utilities" (OuterVolumeSpecName: "utilities") pod "b7827d0f-4b87-44d4-84b8-48d08cd827df" (UID: "b7827d0f-4b87-44d4-84b8-48d08cd827df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:51:27 crc kubenswrapper[4777]: I0216 21:51:27.828816 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7827d0f-4b87-44d4-84b8-48d08cd827df-kube-api-access-hqbnk" (OuterVolumeSpecName: "kube-api-access-hqbnk") pod "b7827d0f-4b87-44d4-84b8-48d08cd827df" (UID: "b7827d0f-4b87-44d4-84b8-48d08cd827df"). InnerVolumeSpecName "kube-api-access-hqbnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:51:27 crc kubenswrapper[4777]: I0216 21:51:27.921331 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:51:27 crc kubenswrapper[4777]: I0216 21:51:27.921383 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqbnk\" (UniqueName: \"kubernetes.io/projected/b7827d0f-4b87-44d4-84b8-48d08cd827df-kube-api-access-hqbnk\") on node \"crc\" DevicePath \"\"" Feb 16 21:51:27 crc kubenswrapper[4777]: I0216 21:51:27.943396 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7827d0f-4b87-44d4-84b8-48d08cd827df" (UID: "b7827d0f-4b87-44d4-84b8-48d08cd827df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.022386 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7827d0f-4b87-44d4-84b8-48d08cd827df-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.426913 4777 generic.go:334] "Generic (PLEG): container finished" podID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerID="77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130" exitCode=0 Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.427286 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2mg97" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.428328 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mg97" event={"ID":"b7827d0f-4b87-44d4-84b8-48d08cd827df","Type":"ContainerDied","Data":"77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130"} Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.428371 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mg97" event={"ID":"b7827d0f-4b87-44d4-84b8-48d08cd827df","Type":"ContainerDied","Data":"d74cf0a5967faaf3f4df0045d42a85b50009e1c65ff60d110794bdcc892ed9d0"} Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.428392 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.428418 4777 scope.go:117] "RemoveContainer" containerID="77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.430388 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-5b545f4d58-l9tqd" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.450032 4777 scope.go:117] "RemoveContainer" containerID="4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.475325 4777 scope.go:117] "RemoveContainer" containerID="5f94d0ba6e5cc760336dd4539c74bb0d25a2e3a2bcc9ce291f0f0e6a2be876a7" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.494940 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2mg97"] Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.499653 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-2mg97"] Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.517630 4777 scope.go:117] "RemoveContainer" containerID="77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130" Feb 16 21:51:28 crc kubenswrapper[4777]: E0216 21:51:28.518223 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130\": container with ID starting with 77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130 not found: ID does not exist" containerID="77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.518369 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130"} err="failed to get container status \"77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130\": rpc error: code = NotFound desc = could not find container \"77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130\": container with ID starting with 77ef17b87fabee5a3ff2cba53cbe8a17ae2ec3da9236a317141f5597185b2130 not found: ID does not exist" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.518477 4777 scope.go:117] "RemoveContainer" containerID="4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4" Feb 16 21:51:28 crc kubenswrapper[4777]: E0216 21:51:28.518984 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4\": container with ID starting with 4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4 not found: ID does not exist" containerID="4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.519037 
4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4"} err="failed to get container status \"4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4\": rpc error: code = NotFound desc = could not find container \"4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4\": container with ID starting with 4b9de8e06f4e6915ad8358f9cbd6ba87630e7488dbdaeb3bb2bee421641a29c4 not found: ID does not exist" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.519080 4777 scope.go:117] "RemoveContainer" containerID="5f94d0ba6e5cc760336dd4539c74bb0d25a2e3a2bcc9ce291f0f0e6a2be876a7" Feb 16 21:51:28 crc kubenswrapper[4777]: E0216 21:51:28.519412 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f94d0ba6e5cc760336dd4539c74bb0d25a2e3a2bcc9ce291f0f0e6a2be876a7\": container with ID starting with 5f94d0ba6e5cc760336dd4539c74bb0d25a2e3a2bcc9ce291f0f0e6a2be876a7 not found: ID does not exist" containerID="5f94d0ba6e5cc760336dd4539c74bb0d25a2e3a2bcc9ce291f0f0e6a2be876a7" Feb 16 21:51:28 crc kubenswrapper[4777]: I0216 21:51:28.519450 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f94d0ba6e5cc760336dd4539c74bb0d25a2e3a2bcc9ce291f0f0e6a2be876a7"} err="failed to get container status \"5f94d0ba6e5cc760336dd4539c74bb0d25a2e3a2bcc9ce291f0f0e6a2be876a7\": rpc error: code = NotFound desc = could not find container \"5f94d0ba6e5cc760336dd4539c74bb0d25a2e3a2bcc9ce291f0f0e6a2be876a7\": container with ID starting with 5f94d0ba6e5cc760336dd4539c74bb0d25a2e3a2bcc9ce291f0f0e6a2be876a7 not found: ID does not exist" Feb 16 21:51:30 crc kubenswrapper[4777]: I0216 21:51:30.193363 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7827d0f-4b87-44d4-84b8-48d08cd827df" 
path="/var/lib/kubelet/pods/b7827d0f-4b87-44d4-84b8-48d08cd827df/volumes" Feb 16 21:52:01 crc kubenswrapper[4777]: I0216 21:52:01.782656 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5"] Feb 16 21:52:01 crc kubenswrapper[4777]: E0216 21:52:01.783654 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerName="extract-utilities" Feb 16 21:52:01 crc kubenswrapper[4777]: I0216 21:52:01.783676 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerName="extract-utilities" Feb 16 21:52:01 crc kubenswrapper[4777]: E0216 21:52:01.783705 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerName="extract-content" Feb 16 21:52:01 crc kubenswrapper[4777]: I0216 21:52:01.783739 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerName="extract-content" Feb 16 21:52:01 crc kubenswrapper[4777]: E0216 21:52:01.783762 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerName="registry-server" Feb 16 21:52:01 crc kubenswrapper[4777]: I0216 21:52:01.783774 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerName="registry-server" Feb 16 21:52:01 crc kubenswrapper[4777]: I0216 21:52:01.783950 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7827d0f-4b87-44d4-84b8-48d08cd827df" containerName="registry-server" Feb 16 21:52:01 crc kubenswrapper[4777]: I0216 21:52:01.785657 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:01 crc kubenswrapper[4777]: I0216 21:52:01.788781 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 16 21:52:01 crc kubenswrapper[4777]: I0216 21:52:01.792968 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5"] Feb 16 21:52:01 crc kubenswrapper[4777]: I0216 21:52:01.923487 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:01 crc kubenswrapper[4777]: I0216 21:52:01.923901 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw95m\" (UniqueName: \"kubernetes.io/projected/e7e0de6f-7050-4f09-b06d-ccab19191b6d-kube-api-access-kw95m\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:01 crc kubenswrapper[4777]: I0216 21:52:01.923959 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:02 crc kubenswrapper[4777]: 
I0216 21:52:02.024482 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw95m\" (UniqueName: \"kubernetes.io/projected/e7e0de6f-7050-4f09-b06d-ccab19191b6d-kube-api-access-kw95m\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:02 crc kubenswrapper[4777]: I0216 21:52:02.024583 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:02 crc kubenswrapper[4777]: I0216 21:52:02.024647 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:02 crc kubenswrapper[4777]: I0216 21:52:02.025437 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:02 crc kubenswrapper[4777]: I0216 21:52:02.025637 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:02 crc kubenswrapper[4777]: I0216 21:52:02.053536 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw95m\" (UniqueName: \"kubernetes.io/projected/e7e0de6f-7050-4f09-b06d-ccab19191b6d-kube-api-access-kw95m\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:02 crc kubenswrapper[4777]: I0216 21:52:02.104075 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:02 crc kubenswrapper[4777]: I0216 21:52:02.593945 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5"] Feb 16 21:52:02 crc kubenswrapper[4777]: I0216 21:52:02.683775 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" event={"ID":"e7e0de6f-7050-4f09-b06d-ccab19191b6d","Type":"ContainerStarted","Data":"03aa206df552e7443a580fe7b70de56132f2a106a424cb2fded8e1f4046df0f4"} Feb 16 21:52:03 crc kubenswrapper[4777]: I0216 21:52:03.694513 4777 generic.go:334] "Generic (PLEG): container finished" podID="e7e0de6f-7050-4f09-b06d-ccab19191b6d" containerID="4707c4f1b7b8e9777d5cc1648d1c92829d5a42b4b2372ce8b21fa1ccebfddf6e" exitCode=0 Feb 16 21:52:03 crc kubenswrapper[4777]: I0216 21:52:03.695030 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" event={"ID":"e7e0de6f-7050-4f09-b06d-ccab19191b6d","Type":"ContainerDied","Data":"4707c4f1b7b8e9777d5cc1648d1c92829d5a42b4b2372ce8b21fa1ccebfddf6e"} Feb 16 21:52:05 crc kubenswrapper[4777]: I0216 21:52:05.712223 4777 generic.go:334] "Generic (PLEG): container finished" podID="e7e0de6f-7050-4f09-b06d-ccab19191b6d" containerID="65e943b47017f2c08cb86bf7098ca6d5fbd83b1c9dc5befdb42cd05d84190461" exitCode=0 Feb 16 21:52:05 crc kubenswrapper[4777]: I0216 21:52:05.712484 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" event={"ID":"e7e0de6f-7050-4f09-b06d-ccab19191b6d","Type":"ContainerDied","Data":"65e943b47017f2c08cb86bf7098ca6d5fbd83b1c9dc5befdb42cd05d84190461"} Feb 16 21:52:06 crc kubenswrapper[4777]: I0216 21:52:06.724236 4777 generic.go:334] "Generic (PLEG): container finished" podID="e7e0de6f-7050-4f09-b06d-ccab19191b6d" containerID="54a5176d73495a0b00f56e05bbc87ee0857330c4bf9f0d233a6baaf52942cef0" exitCode=0 Feb 16 21:52:06 crc kubenswrapper[4777]: I0216 21:52:06.724340 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" event={"ID":"e7e0de6f-7050-4f09-b06d-ccab19191b6d","Type":"ContainerDied","Data":"54a5176d73495a0b00f56e05bbc87ee0857330c4bf9f0d233a6baaf52942cef0"} Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.047572 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.211144 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-bundle\") pod \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.211240 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-util\") pod \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.211309 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw95m\" (UniqueName: \"kubernetes.io/projected/e7e0de6f-7050-4f09-b06d-ccab19191b6d-kube-api-access-kw95m\") pod \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\" (UID: \"e7e0de6f-7050-4f09-b06d-ccab19191b6d\") " Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.212157 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-bundle" (OuterVolumeSpecName: "bundle") pod "e7e0de6f-7050-4f09-b06d-ccab19191b6d" (UID: "e7e0de6f-7050-4f09-b06d-ccab19191b6d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.220964 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e0de6f-7050-4f09-b06d-ccab19191b6d-kube-api-access-kw95m" (OuterVolumeSpecName: "kube-api-access-kw95m") pod "e7e0de6f-7050-4f09-b06d-ccab19191b6d" (UID: "e7e0de6f-7050-4f09-b06d-ccab19191b6d"). InnerVolumeSpecName "kube-api-access-kw95m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.234522 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-util" (OuterVolumeSpecName: "util") pod "e7e0de6f-7050-4f09-b06d-ccab19191b6d" (UID: "e7e0de6f-7050-4f09-b06d-ccab19191b6d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.312571 4777 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-util\") on node \"crc\" DevicePath \"\"" Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.312619 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw95m\" (UniqueName: \"kubernetes.io/projected/e7e0de6f-7050-4f09-b06d-ccab19191b6d-kube-api-access-kw95m\") on node \"crc\" DevicePath \"\"" Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.312633 4777 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e7e0de6f-7050-4f09-b06d-ccab19191b6d-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.743748 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" event={"ID":"e7e0de6f-7050-4f09-b06d-ccab19191b6d","Type":"ContainerDied","Data":"03aa206df552e7443a580fe7b70de56132f2a106a424cb2fded8e1f4046df0f4"} Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.744147 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03aa206df552e7443a580fe7b70de56132f2a106a424cb2fded8e1f4046df0f4" Feb 16 21:52:08 crc kubenswrapper[4777]: I0216 21:52:08.743845 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.002691 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-2mzfn"] Feb 16 21:52:11 crc kubenswrapper[4777]: E0216 21:52:11.003124 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e0de6f-7050-4f09-b06d-ccab19191b6d" containerName="pull" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.003147 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e0de6f-7050-4f09-b06d-ccab19191b6d" containerName="pull" Feb 16 21:52:11 crc kubenswrapper[4777]: E0216 21:52:11.003190 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e0de6f-7050-4f09-b06d-ccab19191b6d" containerName="util" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.003202 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e0de6f-7050-4f09-b06d-ccab19191b6d" containerName="util" Feb 16 21:52:11 crc kubenswrapper[4777]: E0216 21:52:11.003218 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e0de6f-7050-4f09-b06d-ccab19191b6d" containerName="extract" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.003234 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e0de6f-7050-4f09-b06d-ccab19191b6d" containerName="extract" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.003452 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e0de6f-7050-4f09-b06d-ccab19191b6d" containerName="extract" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.004191 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-2mzfn" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.006879 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.008956 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-blk4k" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.010386 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.113601 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-2mzfn"] Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.150327 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhq5n\" (UniqueName: \"kubernetes.io/projected/cac8ca17-cdb3-4745-a133-f197869914b4-kube-api-access-nhq5n\") pod \"nmstate-operator-694c9596b7-2mzfn\" (UID: \"cac8ca17-cdb3-4745-a133-f197869914b4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-2mzfn" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.251910 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhq5n\" (UniqueName: \"kubernetes.io/projected/cac8ca17-cdb3-4745-a133-f197869914b4-kube-api-access-nhq5n\") pod \"nmstate-operator-694c9596b7-2mzfn\" (UID: \"cac8ca17-cdb3-4745-a133-f197869914b4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-2mzfn" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.280488 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhq5n\" (UniqueName: \"kubernetes.io/projected/cac8ca17-cdb3-4745-a133-f197869914b4-kube-api-access-nhq5n\") pod \"nmstate-operator-694c9596b7-2mzfn\" (UID: 
\"cac8ca17-cdb3-4745-a133-f197869914b4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-2mzfn" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.332302 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-2mzfn" Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.556956 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-2mzfn"] Feb 16 21:52:11 crc kubenswrapper[4777]: I0216 21:52:11.763047 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-2mzfn" event={"ID":"cac8ca17-cdb3-4745-a133-f197869914b4","Type":"ContainerStarted","Data":"bfed5ac026913be632402e6335ec95e897b59700316e4c074225b2d97e4101cf"} Feb 16 21:52:14 crc kubenswrapper[4777]: I0216 21:52:14.794822 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-2mzfn" event={"ID":"cac8ca17-cdb3-4745-a133-f197869914b4","Type":"ContainerStarted","Data":"137e3bb62fdf5c11dd08e0c7dde880678b18fd0e024d20a72e61ba73b5ad046b"} Feb 16 21:52:14 crc kubenswrapper[4777]: I0216 21:52:14.821212 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-2mzfn" podStartSLOduration=2.6781320429999997 podStartE2EDuration="4.821186998s" podCreationTimestamp="2026-02-16 21:52:10 +0000 UTC" firstStartedPulling="2026-02-16 21:52:11.564435816 +0000 UTC m=+852.146936918" lastFinishedPulling="2026-02-16 21:52:13.707490771 +0000 UTC m=+854.289991873" observedRunningTime="2026-02-16 21:52:14.814452389 +0000 UTC m=+855.396953511" watchObservedRunningTime="2026-02-16 21:52:14.821186998 +0000 UTC m=+855.403688110" Feb 16 21:52:15 crc kubenswrapper[4777]: I0216 21:52:15.923522 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-d969w"] Feb 16 21:52:15 crc kubenswrapper[4777]: I0216 
21:52:15.924866 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d969w" Feb 16 21:52:15 crc kubenswrapper[4777]: I0216 21:52:15.929461 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-r6mw7" Feb 16 21:52:15 crc kubenswrapper[4777]: I0216 21:52:15.987102 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr"] Feb 16 21:52:15 crc kubenswrapper[4777]: I0216 21:52:15.987997 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr" Feb 16 21:52:15 crc kubenswrapper[4777]: I0216 21:52:15.990790 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 16 21:52:15 crc kubenswrapper[4777]: I0216 21:52:15.998114 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-d969w"] Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.003894 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-klpn2"] Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.004801 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.016270 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr"] Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.016949 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd7hg\" (UniqueName: \"kubernetes.io/projected/1572cd79-8c35-4985-ad71-685580878c04-kube-api-access-hd7hg\") pod \"nmstate-metrics-58c85c668d-d969w\" (UID: \"1572cd79-8c35-4985-ad71-685580878c04\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-d969w" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.087777 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd"] Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.088488 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.092919 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jjxrx" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.092938 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.092935 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.106814 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd"] Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.117655 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmprj\" (UniqueName: 
\"kubernetes.io/projected/1c793dcc-4939-49c3-b6e3-6866f23ae0ae-kube-api-access-rmprj\") pod \"nmstate-webhook-866bcb46dc-9b2zr\" (UID: \"1c793dcc-4939-49c3-b6e3-6866f23ae0ae\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.117726 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl455\" (UniqueName: \"kubernetes.io/projected/5b920bcb-4d07-48db-b7bd-78ce2481fa08-kube-api-access-nl455\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.117931 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1c793dcc-4939-49c3-b6e3-6866f23ae0ae-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-9b2zr\" (UID: \"1c793dcc-4939-49c3-b6e3-6866f23ae0ae\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.117992 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5b920bcb-4d07-48db-b7bd-78ce2481fa08-ovs-socket\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.118070 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd7hg\" (UniqueName: \"kubernetes.io/projected/1572cd79-8c35-4985-ad71-685580878c04-kube-api-access-hd7hg\") pod \"nmstate-metrics-58c85c668d-d969w\" (UID: \"1572cd79-8c35-4985-ad71-685580878c04\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-d969w" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.118135 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5b920bcb-4d07-48db-b7bd-78ce2481fa08-dbus-socket\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.118162 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5b920bcb-4d07-48db-b7bd-78ce2481fa08-nmstate-lock\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.139305 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd7hg\" (UniqueName: \"kubernetes.io/projected/1572cd79-8c35-4985-ad71-685580878c04-kube-api-access-hd7hg\") pod \"nmstate-metrics-58c85c668d-d969w\" (UID: \"1572cd79-8c35-4985-ad71-685580878c04\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-d969w" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.219550 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dthr7\" (UniqueName: \"kubernetes.io/projected/25b13f2e-0ccf-4c04-96ae-96da39ecf652-kube-api-access-dthr7\") pod \"nmstate-console-plugin-5c78fc5d65-pvxhd\" (UID: \"25b13f2e-0ccf-4c04-96ae-96da39ecf652\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.219622 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1c793dcc-4939-49c3-b6e3-6866f23ae0ae-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-9b2zr\" (UID: \"1c793dcc-4939-49c3-b6e3-6866f23ae0ae\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr" Feb 16 21:52:16 crc 
kubenswrapper[4777]: I0216 21:52:16.219651 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5b920bcb-4d07-48db-b7bd-78ce2481fa08-ovs-socket\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.219674 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/25b13f2e-0ccf-4c04-96ae-96da39ecf652-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-pvxhd\" (UID: \"25b13f2e-0ccf-4c04-96ae-96da39ecf652\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.219730 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5b920bcb-4d07-48db-b7bd-78ce2481fa08-dbus-socket\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.219746 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5b920bcb-4d07-48db-b7bd-78ce2481fa08-nmstate-lock\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.219771 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmprj\" (UniqueName: \"kubernetes.io/projected/1c793dcc-4939-49c3-b6e3-6866f23ae0ae-kube-api-access-rmprj\") pod \"nmstate-webhook-866bcb46dc-9b2zr\" (UID: \"1c793dcc-4939-49c3-b6e3-6866f23ae0ae\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr" Feb 16 21:52:16 crc 
kubenswrapper[4777]: I0216 21:52:16.219794 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/25b13f2e-0ccf-4c04-96ae-96da39ecf652-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-pvxhd\" (UID: \"25b13f2e-0ccf-4c04-96ae-96da39ecf652\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.219816 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl455\" (UniqueName: \"kubernetes.io/projected/5b920bcb-4d07-48db-b7bd-78ce2481fa08-kube-api-access-nl455\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.220786 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5b920bcb-4d07-48db-b7bd-78ce2481fa08-ovs-socket\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.220903 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5b920bcb-4d07-48db-b7bd-78ce2481fa08-dbus-socket\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.221251 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5b920bcb-4d07-48db-b7bd-78ce2481fa08-nmstate-lock\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.227432 4777 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1c793dcc-4939-49c3-b6e3-6866f23ae0ae-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-9b2zr\" (UID: \"1c793dcc-4939-49c3-b6e3-6866f23ae0ae\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.239870 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl455\" (UniqueName: \"kubernetes.io/projected/5b920bcb-4d07-48db-b7bd-78ce2481fa08-kube-api-access-nl455\") pod \"nmstate-handler-klpn2\" (UID: \"5b920bcb-4d07-48db-b7bd-78ce2481fa08\") " pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.241516 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d969w" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.244482 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmprj\" (UniqueName: \"kubernetes.io/projected/1c793dcc-4939-49c3-b6e3-6866f23ae0ae-kube-api-access-rmprj\") pod \"nmstate-webhook-866bcb46dc-9b2zr\" (UID: \"1c793dcc-4939-49c3-b6e3-6866f23ae0ae\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.276779 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d954dfcbb-m776b"] Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.277482 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.288298 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d954dfcbb-m776b"] Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.325536 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd7123fb-24bb-4e06-b04c-874929f0c494-console-serving-cert\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.325599 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-service-ca\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.325629 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-console-config\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.325648 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhmhf\" (UniqueName: \"kubernetes.io/projected/bd7123fb-24bb-4e06-b04c-874929f0c494-kube-api-access-mhmhf\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.325677 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/25b13f2e-0ccf-4c04-96ae-96da39ecf652-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-pvxhd\" (UID: \"25b13f2e-0ccf-4c04-96ae-96da39ecf652\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.325727 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-trusted-ca-bundle\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.325746 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-oauth-serving-cert\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.325776 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd7123fb-24bb-4e06-b04c-874929f0c494-console-oauth-config\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.325798 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/25b13f2e-0ccf-4c04-96ae-96da39ecf652-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-pvxhd\" (UID: \"25b13f2e-0ccf-4c04-96ae-96da39ecf652\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" Feb 16 
21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.325824 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dthr7\" (UniqueName: \"kubernetes.io/projected/25b13f2e-0ccf-4c04-96ae-96da39ecf652-kube-api-access-dthr7\") pod \"nmstate-console-plugin-5c78fc5d65-pvxhd\" (UID: \"25b13f2e-0ccf-4c04-96ae-96da39ecf652\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.326546 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.342514 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-klpn2" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.346613 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/25b13f2e-0ccf-4c04-96ae-96da39ecf652-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-pvxhd\" (UID: \"25b13f2e-0ccf-4c04-96ae-96da39ecf652\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.358144 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/25b13f2e-0ccf-4c04-96ae-96da39ecf652-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-pvxhd\" (UID: \"25b13f2e-0ccf-4c04-96ae-96da39ecf652\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.375060 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dthr7\" (UniqueName: \"kubernetes.io/projected/25b13f2e-0ccf-4c04-96ae-96da39ecf652-kube-api-access-dthr7\") pod \"nmstate-console-plugin-5c78fc5d65-pvxhd\" (UID: \"25b13f2e-0ccf-4c04-96ae-96da39ecf652\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.402232 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.427474 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-trusted-ca-bundle\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.427815 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-oauth-serving-cert\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.427847 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd7123fb-24bb-4e06-b04c-874929f0c494-console-oauth-config\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.427884 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd7123fb-24bb-4e06-b04c-874929f0c494-console-serving-cert\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.427914 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-service-ca\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.427941 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-console-config\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.427960 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhmhf\" (UniqueName: \"kubernetes.io/projected/bd7123fb-24bb-4e06-b04c-874929f0c494-kube-api-access-mhmhf\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.429501 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-service-ca\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.429541 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-trusted-ca-bundle\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.429911 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-console-config\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.430136 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd7123fb-24bb-4e06-b04c-874929f0c494-oauth-serving-cert\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.434098 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd7123fb-24bb-4e06-b04c-874929f0c494-console-oauth-config\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.438966 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd7123fb-24bb-4e06-b04c-874929f0c494-console-serving-cert\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.450999 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhmhf\" (UniqueName: \"kubernetes.io/projected/bd7123fb-24bb-4e06-b04c-874929f0c494-kube-api-access-mhmhf\") pod \"console-7d954dfcbb-m776b\" (UID: \"bd7123fb-24bb-4e06-b04c-874929f0c494\") " pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.533419 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-d969w"] Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 
21:52:16.662901 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d954dfcbb-m776b" Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.674086 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd"] Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.826670 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-klpn2" event={"ID":"5b920bcb-4d07-48db-b7bd-78ce2481fa08","Type":"ContainerStarted","Data":"5e49fe3392f0871dd4ad6ce02c6bbc8a7519f5bce2d9be9c0f2f4c0bb0788005"} Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.831126 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr"] Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.835631 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d969w" event={"ID":"1572cd79-8c35-4985-ad71-685580878c04","Type":"ContainerStarted","Data":"bccf1785eb1eb20955be2144c324e0e1e6d5cd0d9f3f105238816c0285ca8c39"} Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.837279 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" event={"ID":"25b13f2e-0ccf-4c04-96ae-96da39ecf652","Type":"ContainerStarted","Data":"26b53832e9b65e917009300a606670db05f7ce29d74f1ee8a9a42141da9d6e27"} Feb 16 21:52:16 crc kubenswrapper[4777]: W0216 21:52:16.842604 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c793dcc_4939_49c3_b6e3_6866f23ae0ae.slice/crio-cfdbb968582c6e6b56a2a296dfab34649c8ebbbda756d4390f328fd2fc1aeb14 WatchSource:0}: Error finding container cfdbb968582c6e6b56a2a296dfab34649c8ebbbda756d4390f328fd2fc1aeb14: Status 404 returned error can't find the container with id 
cfdbb968582c6e6b56a2a296dfab34649c8ebbbda756d4390f328fd2fc1aeb14 Feb 16 21:52:16 crc kubenswrapper[4777]: I0216 21:52:16.927289 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d954dfcbb-m776b"] Feb 16 21:52:16 crc kubenswrapper[4777]: W0216 21:52:16.931057 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd7123fb_24bb_4e06_b04c_874929f0c494.slice/crio-d4508a61f0fa9448e952e100f41e429e73df91024831c72198aa1144d9a0c06a WatchSource:0}: Error finding container d4508a61f0fa9448e952e100f41e429e73df91024831c72198aa1144d9a0c06a: Status 404 returned error can't find the container with id d4508a61f0fa9448e952e100f41e429e73df91024831c72198aa1144d9a0c06a Feb 16 21:52:17 crc kubenswrapper[4777]: I0216 21:52:17.848060 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr" event={"ID":"1c793dcc-4939-49c3-b6e3-6866f23ae0ae","Type":"ContainerStarted","Data":"cfdbb968582c6e6b56a2a296dfab34649c8ebbbda756d4390f328fd2fc1aeb14"} Feb 16 21:52:17 crc kubenswrapper[4777]: I0216 21:52:17.849585 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d954dfcbb-m776b" event={"ID":"bd7123fb-24bb-4e06-b04c-874929f0c494","Type":"ContainerStarted","Data":"60cc4b6ba8eae110f82a11ab3cd1e767b4b6b6d10b5047433e6463b94bd5936d"} Feb 16 21:52:17 crc kubenswrapper[4777]: I0216 21:52:17.849634 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d954dfcbb-m776b" event={"ID":"bd7123fb-24bb-4e06-b04c-874929f0c494","Type":"ContainerStarted","Data":"d4508a61f0fa9448e952e100f41e429e73df91024831c72198aa1144d9a0c06a"} Feb 16 21:52:17 crc kubenswrapper[4777]: I0216 21:52:17.872505 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d954dfcbb-m776b" podStartSLOduration=1.872482789 podStartE2EDuration="1.872482789s" 
podCreationTimestamp="2026-02-16 21:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:52:17.867388917 +0000 UTC m=+858.449890029" watchObservedRunningTime="2026-02-16 21:52:17.872482789 +0000 UTC m=+858.454983891"
Feb 16 21:52:20 crc kubenswrapper[4777]: I0216 21:52:20.880164 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr" event={"ID":"1c793dcc-4939-49c3-b6e3-6866f23ae0ae","Type":"ContainerStarted","Data":"24e44a6035ba8e88415ec4b19ec2440f32cbe3a7741bb216a53192b9c8ffd242"}
Feb 16 21:52:20 crc kubenswrapper[4777]: I0216 21:52:20.881150 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr"
Feb 16 21:52:20 crc kubenswrapper[4777]: I0216 21:52:20.883845 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-klpn2" event={"ID":"5b920bcb-4d07-48db-b7bd-78ce2481fa08","Type":"ContainerStarted","Data":"84335e467a1149cba687faf423f5ba9e28a44c634ea4897ec8001056927e0e7e"}
Feb 16 21:52:20 crc kubenswrapper[4777]: I0216 21:52:20.884015 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-klpn2"
Feb 16 21:52:20 crc kubenswrapper[4777]: I0216 21:52:20.886632 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d969w" event={"ID":"1572cd79-8c35-4985-ad71-685580878c04","Type":"ContainerStarted","Data":"eabc9794c20f496e50d735e93095fdf0f89ebfbc7d7bdbd37e1e8f73611c022d"}
Feb 16 21:52:20 crc kubenswrapper[4777]: I0216 21:52:20.890436 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" event={"ID":"25b13f2e-0ccf-4c04-96ae-96da39ecf652","Type":"ContainerStarted","Data":"93540788dcd31c451074576d09fe96e02a901ad0059dd87aa64c0183a92d8ec0"}
Feb 16 21:52:20 crc kubenswrapper[4777]: I0216 21:52:20.935222 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr" podStartSLOduration=2.9364848070000003 podStartE2EDuration="5.93519238s" podCreationTimestamp="2026-02-16 21:52:15 +0000 UTC" firstStartedPulling="2026-02-16 21:52:16.847949544 +0000 UTC m=+857.430450646" lastFinishedPulling="2026-02-16 21:52:19.846657117 +0000 UTC m=+860.429158219" observedRunningTime="2026-02-16 21:52:20.91588278 +0000 UTC m=+861.498383892" watchObservedRunningTime="2026-02-16 21:52:20.93519238 +0000 UTC m=+861.517693522"
Feb 16 21:52:20 crc kubenswrapper[4777]: I0216 21:52:20.940753 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-klpn2" podStartSLOduration=2.468526983 podStartE2EDuration="5.940707064s" podCreationTimestamp="2026-02-16 21:52:15 +0000 UTC" firstStartedPulling="2026-02-16 21:52:16.392935732 +0000 UTC m=+856.975436834" lastFinishedPulling="2026-02-16 21:52:19.865115813 +0000 UTC m=+860.447616915" observedRunningTime="2026-02-16 21:52:20.93342361 +0000 UTC m=+861.515924742" watchObservedRunningTime="2026-02-16 21:52:20.940707064 +0000 UTC m=+861.523208176"
Feb 16 21:52:20 crc kubenswrapper[4777]: I0216 21:52:20.961555 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-pvxhd" podStartSLOduration=1.801593178 podStartE2EDuration="4.961518425s" podCreationTimestamp="2026-02-16 21:52:16 +0000 UTC" firstStartedPulling="2026-02-16 21:52:16.682006058 +0000 UTC m=+857.264507160" lastFinishedPulling="2026-02-16 21:52:19.841931305 +0000 UTC m=+860.424432407" observedRunningTime="2026-02-16 21:52:20.951122465 +0000 UTC m=+861.533623597" watchObservedRunningTime="2026-02-16 21:52:20.961518425 +0000 UTC m=+861.544019567"
Feb 16 21:52:22 crc kubenswrapper[4777]: I0216 21:52:22.912169 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d969w" event={"ID":"1572cd79-8c35-4985-ad71-685580878c04","Type":"ContainerStarted","Data":"f579de0caf5a4699405c83c03a8b6e3e6604203bfa9f798087edbbb1ffdb60c2"}
Feb 16 21:52:22 crc kubenswrapper[4777]: I0216 21:52:22.947832 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d969w" podStartSLOduration=2.093215667 podStartE2EDuration="7.947791541s" podCreationTimestamp="2026-02-16 21:52:15 +0000 UTC" firstStartedPulling="2026-02-16 21:52:16.536682048 +0000 UTC m=+857.119183150" lastFinishedPulling="2026-02-16 21:52:22.391257922 +0000 UTC m=+862.973759024" observedRunningTime="2026-02-16 21:52:22.939583272 +0000 UTC m=+863.522084414" watchObservedRunningTime="2026-02-16 21:52:22.947791541 +0000 UTC m=+863.530292693"
Feb 16 21:52:26 crc kubenswrapper[4777]: I0216 21:52:26.372307 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-klpn2"
Feb 16 21:52:26 crc kubenswrapper[4777]: I0216 21:52:26.663789 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7d954dfcbb-m776b"
Feb 16 21:52:26 crc kubenswrapper[4777]: I0216 21:52:26.664223 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d954dfcbb-m776b"
Feb 16 21:52:26 crc kubenswrapper[4777]: I0216 21:52:26.673987 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d954dfcbb-m776b"
Feb 16 21:52:26 crc kubenswrapper[4777]: I0216 21:52:26.949203 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d954dfcbb-m776b"
Feb 16 21:52:27 crc kubenswrapper[4777]: I0216 21:52:27.024999 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rxnqn"]
Feb 16 21:52:36 crc kubenswrapper[4777]: I0216 21:52:36.353540 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9b2zr"
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.080419 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rxnqn" podUID="7575698b-ab60-49ed-9d95-744b540314a5" containerName="console" containerID="cri-o://71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820" gracePeriod=15
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.300709 4777 patch_prober.go:28] interesting pod/console-f9d7485db-rxnqn container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.301056 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-rxnqn" podUID="7575698b-ab60-49ed-9d95-744b540314a5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused"
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.541270 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rxnqn_7575698b-ab60-49ed-9d95-744b540314a5/console/0.log"
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.541385 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rxnqn"
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.613797 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-oauth-serving-cert\") pod \"7575698b-ab60-49ed-9d95-744b540314a5\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") "
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.614327 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-service-ca\") pod \"7575698b-ab60-49ed-9d95-744b540314a5\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") "
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.614381 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-serving-cert\") pod \"7575698b-ab60-49ed-9d95-744b540314a5\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") "
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.614434 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-oauth-config\") pod \"7575698b-ab60-49ed-9d95-744b540314a5\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") "
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.614470 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27r2r\" (UniqueName: \"kubernetes.io/projected/7575698b-ab60-49ed-9d95-744b540314a5-kube-api-access-27r2r\") pod \"7575698b-ab60-49ed-9d95-744b540314a5\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") "
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.614498 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-trusted-ca-bundle\") pod \"7575698b-ab60-49ed-9d95-744b540314a5\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") "
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.614613 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-console-config\") pod \"7575698b-ab60-49ed-9d95-744b540314a5\" (UID: \"7575698b-ab60-49ed-9d95-744b540314a5\") "
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.614993 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7575698b-ab60-49ed-9d95-744b540314a5" (UID: "7575698b-ab60-49ed-9d95-744b540314a5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.615649 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-console-config" (OuterVolumeSpecName: "console-config") pod "7575698b-ab60-49ed-9d95-744b540314a5" (UID: "7575698b-ab60-49ed-9d95-744b540314a5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.616753 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7575698b-ab60-49ed-9d95-744b540314a5" (UID: "7575698b-ab60-49ed-9d95-744b540314a5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.617368 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-service-ca" (OuterVolumeSpecName: "service-ca") pod "7575698b-ab60-49ed-9d95-744b540314a5" (UID: "7575698b-ab60-49ed-9d95-744b540314a5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.623271 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7575698b-ab60-49ed-9d95-744b540314a5" (UID: "7575698b-ab60-49ed-9d95-744b540314a5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.624675 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7575698b-ab60-49ed-9d95-744b540314a5-kube-api-access-27r2r" (OuterVolumeSpecName: "kube-api-access-27r2r") pod "7575698b-ab60-49ed-9d95-744b540314a5" (UID: "7575698b-ab60-49ed-9d95-744b540314a5"). InnerVolumeSpecName "kube-api-access-27r2r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.625379 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7575698b-ab60-49ed-9d95-744b540314a5" (UID: "7575698b-ab60-49ed-9d95-744b540314a5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.716394 4777 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-console-config\") on node \"crc\" DevicePath \"\""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.716443 4777 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.716458 4777 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-service-ca\") on node \"crc\" DevicePath \"\""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.716471 4777 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.716486 4777 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7575698b-ab60-49ed-9d95-744b540314a5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.716500 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27r2r\" (UniqueName: \"kubernetes.io/projected/7575698b-ab60-49ed-9d95-744b540314a5-kube-api-access-27r2r\") on node \"crc\" DevicePath \"\""
Feb 16 21:52:52 crc kubenswrapper[4777]: I0216 21:52:52.716514 4777 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7575698b-ab60-49ed-9d95-744b540314a5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 21:52:53 crc kubenswrapper[4777]: I0216 21:52:53.203932 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rxnqn_7575698b-ab60-49ed-9d95-744b540314a5/console/0.log"
Feb 16 21:52:53 crc kubenswrapper[4777]: I0216 21:52:53.203994 4777 generic.go:334] "Generic (PLEG): container finished" podID="7575698b-ab60-49ed-9d95-744b540314a5" containerID="71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820" exitCode=2
Feb 16 21:52:53 crc kubenswrapper[4777]: I0216 21:52:53.204036 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxnqn" event={"ID":"7575698b-ab60-49ed-9d95-744b540314a5","Type":"ContainerDied","Data":"71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820"}
Feb 16 21:52:53 crc kubenswrapper[4777]: I0216 21:52:53.204070 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxnqn" event={"ID":"7575698b-ab60-49ed-9d95-744b540314a5","Type":"ContainerDied","Data":"fb6f1eae3841a289e5e9643c141a1605b01a28f7b06e6965ef4a3e3dc48aae83"}
Feb 16 21:52:53 crc kubenswrapper[4777]: I0216 21:52:53.204091 4777 scope.go:117] "RemoveContainer" containerID="71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820"
Feb 16 21:52:53 crc kubenswrapper[4777]: I0216 21:52:53.204261 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rxnqn"
Feb 16 21:52:53 crc kubenswrapper[4777]: I0216 21:52:53.245915 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rxnqn"]
Feb 16 21:52:53 crc kubenswrapper[4777]: I0216 21:52:53.253274 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rxnqn"]
Feb 16 21:52:53 crc kubenswrapper[4777]: I0216 21:52:53.256771 4777 scope.go:117] "RemoveContainer" containerID="71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820"
Feb 16 21:52:53 crc kubenswrapper[4777]: E0216 21:52:53.259181 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820\": container with ID starting with 71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820 not found: ID does not exist" containerID="71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820"
Feb 16 21:52:53 crc kubenswrapper[4777]: I0216 21:52:53.259224 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820"} err="failed to get container status \"71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820\": rpc error: code = NotFound desc = could not find container \"71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820\": container with ID starting with 71c63cfc32524851fcef0a7d18280ddd974840c69fdf4038ca8f485414ea6820 not found: ID does not exist"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.158754 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"]
Feb 16 21:52:54 crc kubenswrapper[4777]: E0216 21:52:54.159466 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7575698b-ab60-49ed-9d95-744b540314a5" containerName="console"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.159496 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="7575698b-ab60-49ed-9d95-744b540314a5" containerName="console"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.159692 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="7575698b-ab60-49ed-9d95-744b540314a5" containerName="console"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.161278 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.164235 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.169608 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"]
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.194268 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7575698b-ab60-49ed-9d95-744b540314a5" path="/var/lib/kubelet/pods/7575698b-ab60-49ed-9d95-744b540314a5/volumes"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.239779 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.240064 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.240137 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7nbv\" (UniqueName: \"kubernetes.io/projected/7e933fc5-7959-4f15-a628-f8f812aa6eae-kube-api-access-w7nbv\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.342831 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.342986 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.343052 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7nbv\" (UniqueName: \"kubernetes.io/projected/7e933fc5-7959-4f15-a628-f8f812aa6eae-kube-api-access-w7nbv\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.344257 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.344698 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.376852 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7nbv\" (UniqueName: \"kubernetes.io/projected/7e933fc5-7959-4f15-a628-f8f812aa6eae-kube-api-access-w7nbv\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.487224 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:52:54 crc kubenswrapper[4777]: I0216 21:52:54.939944 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"]
Feb 16 21:52:55 crc kubenswrapper[4777]: I0216 21:52:55.222175 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t" event={"ID":"7e933fc5-7959-4f15-a628-f8f812aa6eae","Type":"ContainerStarted","Data":"17aced3deeaecdd574b135ce308ddd4284b70b41b23ec7a8ac378efb31bac0f3"}
Feb 16 21:52:55 crc kubenswrapper[4777]: I0216 21:52:55.222252 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t" event={"ID":"7e933fc5-7959-4f15-a628-f8f812aa6eae","Type":"ContainerStarted","Data":"11a4e5a0155699c7376b159acb7b070bf1e03fb0f54d4ec211b8bbb9c561b007"}
Feb 16 21:52:56 crc kubenswrapper[4777]: I0216 21:52:56.234468 4777 generic.go:334] "Generic (PLEG): container finished" podID="7e933fc5-7959-4f15-a628-f8f812aa6eae" containerID="17aced3deeaecdd574b135ce308ddd4284b70b41b23ec7a8ac378efb31bac0f3" exitCode=0
Feb 16 21:52:56 crc kubenswrapper[4777]: I0216 21:52:56.234612 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t" event={"ID":"7e933fc5-7959-4f15-a628-f8f812aa6eae","Type":"ContainerDied","Data":"17aced3deeaecdd574b135ce308ddd4284b70b41b23ec7a8ac378efb31bac0f3"}
Feb 16 21:52:58 crc kubenswrapper[4777]: I0216 21:52:58.251820 4777 generic.go:334] "Generic (PLEG): container finished" podID="7e933fc5-7959-4f15-a628-f8f812aa6eae" containerID="fbb9e897e291a403ba232e340d545aede7259f068314198d7026eb432760dc4c" exitCode=0
Feb 16 21:52:58 crc kubenswrapper[4777]: I0216 21:52:58.251909 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t" event={"ID":"7e933fc5-7959-4f15-a628-f8f812aa6eae","Type":"ContainerDied","Data":"fbb9e897e291a403ba232e340d545aede7259f068314198d7026eb432760dc4c"}
Feb 16 21:52:59 crc kubenswrapper[4777]: I0216 21:52:59.261979 4777 generic.go:334] "Generic (PLEG): container finished" podID="7e933fc5-7959-4f15-a628-f8f812aa6eae" containerID="54c1c1308b9c64d775109c3193bb96dc70dc54861d9fd9d32408c74cf323732a" exitCode=0
Feb 16 21:52:59 crc kubenswrapper[4777]: I0216 21:52:59.262080 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t" event={"ID":"7e933fc5-7959-4f15-a628-f8f812aa6eae","Type":"ContainerDied","Data":"54c1c1308b9c64d775109c3193bb96dc70dc54861d9fd9d32408c74cf323732a"}
Feb 16 21:53:00 crc kubenswrapper[4777]: I0216 21:53:00.593299 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:53:00 crc kubenswrapper[4777]: I0216 21:53:00.638560 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-bundle\") pod \"7e933fc5-7959-4f15-a628-f8f812aa6eae\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") "
Feb 16 21:53:00 crc kubenswrapper[4777]: I0216 21:53:00.639044 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-util\") pod \"7e933fc5-7959-4f15-a628-f8f812aa6eae\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") "
Feb 16 21:53:00 crc kubenswrapper[4777]: I0216 21:53:00.639106 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7nbv\" (UniqueName: \"kubernetes.io/projected/7e933fc5-7959-4f15-a628-f8f812aa6eae-kube-api-access-w7nbv\") pod \"7e933fc5-7959-4f15-a628-f8f812aa6eae\" (UID: \"7e933fc5-7959-4f15-a628-f8f812aa6eae\") "
Feb 16 21:53:00 crc kubenswrapper[4777]: I0216 21:53:00.639953 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-bundle" (OuterVolumeSpecName: "bundle") pod "7e933fc5-7959-4f15-a628-f8f812aa6eae" (UID: "7e933fc5-7959-4f15-a628-f8f812aa6eae"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:53:00 crc kubenswrapper[4777]: I0216 21:53:00.645128 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e933fc5-7959-4f15-a628-f8f812aa6eae-kube-api-access-w7nbv" (OuterVolumeSpecName: "kube-api-access-w7nbv") pod "7e933fc5-7959-4f15-a628-f8f812aa6eae" (UID: "7e933fc5-7959-4f15-a628-f8f812aa6eae"). InnerVolumeSpecName "kube-api-access-w7nbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:53:00 crc kubenswrapper[4777]: I0216 21:53:00.740008 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7nbv\" (UniqueName: \"kubernetes.io/projected/7e933fc5-7959-4f15-a628-f8f812aa6eae-kube-api-access-w7nbv\") on node \"crc\" DevicePath \"\""
Feb 16 21:53:00 crc kubenswrapper[4777]: I0216 21:53:00.740047 4777 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 21:53:00 crc kubenswrapper[4777]: I0216 21:53:00.958839 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-util" (OuterVolumeSpecName: "util") pod "7e933fc5-7959-4f15-a628-f8f812aa6eae" (UID: "7e933fc5-7959-4f15-a628-f8f812aa6eae"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:53:01 crc kubenswrapper[4777]: I0216 21:53:01.043202 4777 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e933fc5-7959-4f15-a628-f8f812aa6eae-util\") on node \"crc\" DevicePath \"\""
Feb 16 21:53:01 crc kubenswrapper[4777]: I0216 21:53:01.277255 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t" event={"ID":"7e933fc5-7959-4f15-a628-f8f812aa6eae","Type":"ContainerDied","Data":"11a4e5a0155699c7376b159acb7b070bf1e03fb0f54d4ec211b8bbb9c561b007"}
Feb 16 21:53:01 crc kubenswrapper[4777]: I0216 21:53:01.277296 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11a4e5a0155699c7376b159acb7b070bf1e03fb0f54d4ec211b8bbb9c561b007"
Feb 16 21:53:01 crc kubenswrapper[4777]: I0216 21:53:01.277339 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.568001 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"]
Feb 16 21:53:10 crc kubenswrapper[4777]: E0216 21:53:10.568615 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e933fc5-7959-4f15-a628-f8f812aa6eae" containerName="util"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.568628 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e933fc5-7959-4f15-a628-f8f812aa6eae" containerName="util"
Feb 16 21:53:10 crc kubenswrapper[4777]: E0216 21:53:10.568637 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e933fc5-7959-4f15-a628-f8f812aa6eae" containerName="extract"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.568643 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e933fc5-7959-4f15-a628-f8f812aa6eae" containerName="extract"
Feb 16 21:53:10 crc kubenswrapper[4777]: E0216 21:53:10.568663 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e933fc5-7959-4f15-a628-f8f812aa6eae" containerName="pull"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.568669 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e933fc5-7959-4f15-a628-f8f812aa6eae" containerName="pull"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.568778 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e933fc5-7959-4f15-a628-f8f812aa6eae" containerName="extract"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.569182 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.572370 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.572709 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ksk5x"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.573269 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.574827 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.575406 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.589528 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"]
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.668888 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/64191e35-ca2c-4f43-8590-792843cfd6b7-apiservice-cert\") pod \"metallb-operator-controller-manager-7b84c748cb-vs9wb\" (UID: \"64191e35-ca2c-4f43-8590-792843cfd6b7\") " pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.669030 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcjg9\" (UniqueName: \"kubernetes.io/projected/64191e35-ca2c-4f43-8590-792843cfd6b7-kube-api-access-qcjg9\") pod \"metallb-operator-controller-manager-7b84c748cb-vs9wb\" (UID: \"64191e35-ca2c-4f43-8590-792843cfd6b7\") " pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.669075 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/64191e35-ca2c-4f43-8590-792843cfd6b7-webhook-cert\") pod \"metallb-operator-controller-manager-7b84c748cb-vs9wb\" (UID: \"64191e35-ca2c-4f43-8590-792843cfd6b7\") " pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.769943 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcjg9\" (UniqueName: \"kubernetes.io/projected/64191e35-ca2c-4f43-8590-792843cfd6b7-kube-api-access-qcjg9\") pod \"metallb-operator-controller-manager-7b84c748cb-vs9wb\" (UID: \"64191e35-ca2c-4f43-8590-792843cfd6b7\") " pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.770229 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/64191e35-ca2c-4f43-8590-792843cfd6b7-webhook-cert\") pod \"metallb-operator-controller-manager-7b84c748cb-vs9wb\" (UID: \"64191e35-ca2c-4f43-8590-792843cfd6b7\") " pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.771171 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/64191e35-ca2c-4f43-8590-792843cfd6b7-apiservice-cert\") pod \"metallb-operator-controller-manager-7b84c748cb-vs9wb\" (UID: \"64191e35-ca2c-4f43-8590-792843cfd6b7\") " pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.776451 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/64191e35-ca2c-4f43-8590-792843cfd6b7-apiservice-cert\") pod \"metallb-operator-controller-manager-7b84c748cb-vs9wb\" (UID: \"64191e35-ca2c-4f43-8590-792843cfd6b7\") " pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.776904 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/64191e35-ca2c-4f43-8590-792843cfd6b7-webhook-cert\") pod \"metallb-operator-controller-manager-7b84c748cb-vs9wb\" (UID: \"64191e35-ca2c-4f43-8590-792843cfd6b7\") " pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.801280 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcjg9\" (UniqueName: \"kubernetes.io/projected/64191e35-ca2c-4f43-8590-792843cfd6b7-kube-api-access-qcjg9\") pod \"metallb-operator-controller-manager-7b84c748cb-vs9wb\" (UID: \"64191e35-ca2c-4f43-8590-792843cfd6b7\") " pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.884471 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.895006 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq"]
Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.895936 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq"
Feb 16 21:53:10 crc kubenswrapper[4777]: W0216 21:53:10.901569 4777 reflector.go:561] object-"metallb-system"/"metallb-operator-webhook-server-service-cert": failed to list *v1.Secret: secrets "metallb-operator-webhook-server-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object
Feb 16 21:53:10 crc kubenswrapper[4777]: E0216 21:53:10.901633 4777 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-webhook-server-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-webhook-server-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 16 21:53:10 crc kubenswrapper[4777]: W0216 21:53:10.901583 4777 reflector.go:561] object-"metallb-system"/"metallb-webhook-cert": failed to list *v1.Secret: secrets "metallb-webhook-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object
Feb 16 21:53:10 crc kubenswrapper[4777]: E0216 21:53:10.901671 4777 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-webhook-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-webhook-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 16 21:53:10 crc kubenswrapper[4777]: W0216 21:53:10.902060 4777 reflector.go:561] object-"metallb-system"/"controller-dockercfg-zxmpr": failed to list
*v1.Secret: secrets "controller-dockercfg-zxmpr" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Feb 16 21:53:10 crc kubenswrapper[4777]: E0216 21:53:10.902109 4777 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"controller-dockercfg-zxmpr\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"controller-dockercfg-zxmpr\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.930952 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq"] Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.973124 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpjkb\" (UniqueName: \"kubernetes.io/projected/79b8e02d-175c-41be-8a12-6fb9c2da4107-kube-api-access-fpjkb\") pod \"metallb-operator-webhook-server-6895df4d58-8w2wq\" (UID: \"79b8e02d-175c-41be-8a12-6fb9c2da4107\") " pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.973202 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79b8e02d-175c-41be-8a12-6fb9c2da4107-webhook-cert\") pod \"metallb-operator-webhook-server-6895df4d58-8w2wq\" (UID: \"79b8e02d-175c-41be-8a12-6fb9c2da4107\") " pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:10 crc kubenswrapper[4777]: I0216 21:53:10.973238 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/79b8e02d-175c-41be-8a12-6fb9c2da4107-apiservice-cert\") pod \"metallb-operator-webhook-server-6895df4d58-8w2wq\" (UID: \"79b8e02d-175c-41be-8a12-6fb9c2da4107\") " pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.073779 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79b8e02d-175c-41be-8a12-6fb9c2da4107-webhook-cert\") pod \"metallb-operator-webhook-server-6895df4d58-8w2wq\" (UID: \"79b8e02d-175c-41be-8a12-6fb9c2da4107\") " pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.073817 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/79b8e02d-175c-41be-8a12-6fb9c2da4107-apiservice-cert\") pod \"metallb-operator-webhook-server-6895df4d58-8w2wq\" (UID: \"79b8e02d-175c-41be-8a12-6fb9c2da4107\") " pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.073900 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpjkb\" (UniqueName: \"kubernetes.io/projected/79b8e02d-175c-41be-8a12-6fb9c2da4107-kube-api-access-fpjkb\") pod \"metallb-operator-webhook-server-6895df4d58-8w2wq\" (UID: \"79b8e02d-175c-41be-8a12-6fb9c2da4107\") " pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.106988 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpjkb\" (UniqueName: \"kubernetes.io/projected/79b8e02d-175c-41be-8a12-6fb9c2da4107-kube-api-access-fpjkb\") pod \"metallb-operator-webhook-server-6895df4d58-8w2wq\" (UID: \"79b8e02d-175c-41be-8a12-6fb9c2da4107\") " 
pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.220000 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb"] Feb 16 21:53:11 crc kubenswrapper[4777]: W0216 21:53:11.226069 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64191e35_ca2c_4f43_8590_792843cfd6b7.slice/crio-9f6c3843a029866d87a683f7aa82d1730fff4fdd2bbe3f4d6f3ffde7771e79a2 WatchSource:0}: Error finding container 9f6c3843a029866d87a683f7aa82d1730fff4fdd2bbe3f4d6f3ffde7771e79a2: Status 404 returned error can't find the container with id 9f6c3843a029866d87a683f7aa82d1730fff4fdd2bbe3f4d6f3ffde7771e79a2 Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.343994 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb" event={"ID":"64191e35-ca2c-4f43-8590-792843cfd6b7","Type":"ContainerStarted","Data":"9f6c3843a029866d87a683f7aa82d1730fff4fdd2bbe3f4d6f3ffde7771e79a2"} Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.652245 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.652338 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.737125 4777 reflector.go:368] Caches populated for 
*v1.Secret from object-"metallb-system"/"controller-dockercfg-zxmpr" Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.894102 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.908847 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/79b8e02d-175c-41be-8a12-6fb9c2da4107-apiservice-cert\") pod \"metallb-operator-webhook-server-6895df4d58-8w2wq\" (UID: \"79b8e02d-175c-41be-8a12-6fb9c2da4107\") " pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:11 crc kubenswrapper[4777]: I0216 21:53:11.909104 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79b8e02d-175c-41be-8a12-6fb9c2da4107-webhook-cert\") pod \"metallb-operator-webhook-server-6895df4d58-8w2wq\" (UID: \"79b8e02d-175c-41be-8a12-6fb9c2da4107\") " pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:12 crc kubenswrapper[4777]: I0216 21:53:12.144871 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 16 21:53:12 crc kubenswrapper[4777]: I0216 21:53:12.168960 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:12 crc kubenswrapper[4777]: I0216 21:53:12.452858 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq"] Feb 16 21:53:12 crc kubenswrapper[4777]: W0216 21:53:12.460830 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79b8e02d_175c_41be_8a12_6fb9c2da4107.slice/crio-e40ffb17e5b0a1b8d1f3df82c2f970a77937be36ad4bca8555778c5c18687501 WatchSource:0}: Error finding container e40ffb17e5b0a1b8d1f3df82c2f970a77937be36ad4bca8555778c5c18687501: Status 404 returned error can't find the container with id e40ffb17e5b0a1b8d1f3df82c2f970a77937be36ad4bca8555778c5c18687501 Feb 16 21:53:13 crc kubenswrapper[4777]: I0216 21:53:13.359411 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" event={"ID":"79b8e02d-175c-41be-8a12-6fb9c2da4107","Type":"ContainerStarted","Data":"e40ffb17e5b0a1b8d1f3df82c2f970a77937be36ad4bca8555778c5c18687501"} Feb 16 21:53:14 crc kubenswrapper[4777]: I0216 21:53:14.369473 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb" event={"ID":"64191e35-ca2c-4f43-8590-792843cfd6b7","Type":"ContainerStarted","Data":"e9908b1a4ef84f59f795fc9fd8dabbfa676f6f06d6aec461279b4c113ddf7850"} Feb 16 21:53:14 crc kubenswrapper[4777]: I0216 21:53:14.369766 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb" Feb 16 21:53:14 crc kubenswrapper[4777]: I0216 21:53:14.391565 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb" podStartSLOduration=1.698544971 podStartE2EDuration="4.391547962s" 
podCreationTimestamp="2026-02-16 21:53:10 +0000 UTC" firstStartedPulling="2026-02-16 21:53:11.232133881 +0000 UTC m=+911.814634983" lastFinishedPulling="2026-02-16 21:53:13.925136832 +0000 UTC m=+914.507637974" observedRunningTime="2026-02-16 21:53:14.389530096 +0000 UTC m=+914.972031198" watchObservedRunningTime="2026-02-16 21:53:14.391547962 +0000 UTC m=+914.974049064" Feb 16 21:53:17 crc kubenswrapper[4777]: I0216 21:53:17.387071 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" event={"ID":"79b8e02d-175c-41be-8a12-6fb9c2da4107","Type":"ContainerStarted","Data":"a1971d71f79f2b44e05f571a6dfaec79f522c913683ec226a3a2631544209e30"} Feb 16 21:53:17 crc kubenswrapper[4777]: I0216 21:53:17.387903 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:17 crc kubenswrapper[4777]: I0216 21:53:17.406858 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" podStartSLOduration=2.801191619 podStartE2EDuration="7.406833632s" podCreationTimestamp="2026-02-16 21:53:10 +0000 UTC" firstStartedPulling="2026-02-16 21:53:12.464859647 +0000 UTC m=+913.047360749" lastFinishedPulling="2026-02-16 21:53:17.07050165 +0000 UTC m=+917.653002762" observedRunningTime="2026-02-16 21:53:17.404834856 +0000 UTC m=+917.987335988" watchObservedRunningTime="2026-02-16 21:53:17.406833632 +0000 UTC m=+917.989334774" Feb 16 21:53:32 crc kubenswrapper[4777]: I0216 21:53:32.177444 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6895df4d58-8w2wq" Feb 16 21:53:41 crc kubenswrapper[4777]: I0216 21:53:41.652533 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:53:41 crc kubenswrapper[4777]: I0216 21:53:41.653336 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.276802 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fd2pn"] Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.278599 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.296648 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fd2pn"] Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.380985 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptzkz\" (UniqueName: \"kubernetes.io/projected/63dd1e04-3f84-4c99-b1f2-84507b807f8e-kube-api-access-ptzkz\") pod \"certified-operators-fd2pn\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") " pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.381075 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-catalog-content\") pod \"certified-operators-fd2pn\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") " pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.381132 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-utilities\") pod \"certified-operators-fd2pn\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") " pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.482057 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-catalog-content\") pod \"certified-operators-fd2pn\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") " pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.482127 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-utilities\") pod \"certified-operators-fd2pn\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") " pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.482227 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptzkz\" (UniqueName: \"kubernetes.io/projected/63dd1e04-3f84-4c99-b1f2-84507b807f8e-kube-api-access-ptzkz\") pod \"certified-operators-fd2pn\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") " pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.483144 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-catalog-content\") pod \"certified-operators-fd2pn\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") " pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.483183 4777 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-utilities\") pod \"certified-operators-fd2pn\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") " pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.512285 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptzkz\" (UniqueName: \"kubernetes.io/projected/63dd1e04-3f84-4c99-b1f2-84507b807f8e-kube-api-access-ptzkz\") pod \"certified-operators-fd2pn\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") " pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:47 crc kubenswrapper[4777]: I0216 21:53:47.616644 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:48 crc kubenswrapper[4777]: I0216 21:53:48.180856 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fd2pn"] Feb 16 21:53:48 crc kubenswrapper[4777]: I0216 21:53:48.631618 4777 generic.go:334] "Generic (PLEG): container finished" podID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" containerID="bbbbb70a6fa3a0c2ce744a4d8ab944d402350a44f3c38205e2babdf38a2e950a" exitCode=0 Feb 16 21:53:48 crc kubenswrapper[4777]: I0216 21:53:48.631749 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd2pn" event={"ID":"63dd1e04-3f84-4c99-b1f2-84507b807f8e","Type":"ContainerDied","Data":"bbbbb70a6fa3a0c2ce744a4d8ab944d402350a44f3c38205e2babdf38a2e950a"} Feb 16 21:53:48 crc kubenswrapper[4777]: I0216 21:53:48.632272 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd2pn" event={"ID":"63dd1e04-3f84-4c99-b1f2-84507b807f8e","Type":"ContainerStarted","Data":"526482225a8b206ecd7c75947735872d9a48c218e3158289e864b3a3acb6f8d7"} Feb 16 21:53:49 crc kubenswrapper[4777]: I0216 
21:53:49.645429 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd2pn" event={"ID":"63dd1e04-3f84-4c99-b1f2-84507b807f8e","Type":"ContainerStarted","Data":"41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08"} Feb 16 21:53:50 crc kubenswrapper[4777]: I0216 21:53:50.652067 4777 generic.go:334] "Generic (PLEG): container finished" podID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" containerID="41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08" exitCode=0 Feb 16 21:53:50 crc kubenswrapper[4777]: I0216 21:53:50.652130 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd2pn" event={"ID":"63dd1e04-3f84-4c99-b1f2-84507b807f8e","Type":"ContainerDied","Data":"41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08"} Feb 16 21:53:50 crc kubenswrapper[4777]: I0216 21:53:50.887588 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b84c748cb-vs9wb" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.664217 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd2pn" event={"ID":"63dd1e04-3f84-4c99-b1f2-84507b807f8e","Type":"ContainerStarted","Data":"9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc"} Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.701421 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fd2pn" podStartSLOduration=2.28191805 podStartE2EDuration="4.701399255s" podCreationTimestamp="2026-02-16 21:53:47 +0000 UTC" firstStartedPulling="2026-02-16 21:53:48.637079915 +0000 UTC m=+949.219581057" lastFinishedPulling="2026-02-16 21:53:51.05656115 +0000 UTC m=+951.639062262" observedRunningTime="2026-02-16 21:53:51.697744533 +0000 UTC m=+952.280245675" watchObservedRunningTime="2026-02-16 21:53:51.701399255 +0000 UTC 
m=+952.283900367" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.749876 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft"] Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.750924 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.761390 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.761491 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-ftnfz" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.764585 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pq5bx"] Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.767119 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.768955 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.768980 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft"] Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.770823 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.849841 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/203c943d-2dd7-4534-a22d-9954b052748f-frr-startup\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.850131 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-frr-sockets\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.850179 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-frr-conf\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.850213 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/203c943d-2dd7-4534-a22d-9954b052748f-metrics-certs\") pod \"frr-k8s-pq5bx\" (UID: 
\"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.850242 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-reloader\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.850503 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmddj\" (UniqueName: \"kubernetes.io/projected/3a4b98ef-eeb8-4840-85e3-4b9ac5040e27-kube-api-access-mmddj\") pod \"frr-k8s-webhook-server-78b44bf5bb-wb5ft\" (UID: \"3a4b98ef-eeb8-4840-85e3-4b9ac5040e27\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.850539 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a4b98ef-eeb8-4840-85e3-4b9ac5040e27-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-wb5ft\" (UID: \"3a4b98ef-eeb8-4840-85e3-4b9ac5040e27\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.850782 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-metrics\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.850845 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlk6t\" (UniqueName: \"kubernetes.io/projected/203c943d-2dd7-4534-a22d-9954b052748f-kube-api-access-nlk6t\") pod \"frr-k8s-pq5bx\" (UID: 
\"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.863914 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-q8kgx"] Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.865051 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-q8kgx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.867379 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-z4gwv" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.867544 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.867779 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.868985 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-q4xqv"] Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.870476 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.879047 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.883092 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-q4xqv"] Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.883565 4777 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952417 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-memberlist\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952487 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmddj\" (UniqueName: \"kubernetes.io/projected/3a4b98ef-eeb8-4840-85e3-4b9ac5040e27-kube-api-access-mmddj\") pod \"frr-k8s-webhook-server-78b44bf5bb-wb5ft\" (UID: \"3a4b98ef-eeb8-4840-85e3-4b9ac5040e27\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952507 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3e67f384-5475-4237-bc1f-b7781bd0c8eb-metallb-excludel2\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952528 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a4b98ef-eeb8-4840-85e3-4b9ac5040e27-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-wb5ft\" (UID: \"3a4b98ef-eeb8-4840-85e3-4b9ac5040e27\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" 
Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952550 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-metrics-certs\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952582 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-metrics\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952612 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlk6t\" (UniqueName: \"kubernetes.io/projected/203c943d-2dd7-4534-a22d-9954b052748f-kube-api-access-nlk6t\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952633 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c25c0bd-85a5-4a21-9881-81663f96e5c8-cert\") pod \"controller-69bbfbf88f-q4xqv\" (UID: \"3c25c0bd-85a5-4a21-9881-81663f96e5c8\") " pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952657 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/203c943d-2dd7-4534-a22d-9954b052748f-frr-startup\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952685 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-frr-sockets\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952704 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c25c0bd-85a5-4a21-9881-81663f96e5c8-metrics-certs\") pod \"controller-69bbfbf88f-q4xqv\" (UID: \"3c25c0bd-85a5-4a21-9881-81663f96e5c8\") " pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952736 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-frr-conf\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952754 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbz7\" (UniqueName: \"kubernetes.io/projected/3c25c0bd-85a5-4a21-9881-81663f96e5c8-kube-api-access-5qbz7\") pod \"controller-69bbfbf88f-q4xqv\" (UID: \"3c25c0bd-85a5-4a21-9881-81663f96e5c8\") " pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952771 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/203c943d-2dd7-4534-a22d-9954b052748f-metrics-certs\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952789 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6drk\" (UniqueName: 
\"kubernetes.io/projected/3e67f384-5475-4237-bc1f-b7781bd0c8eb-kube-api-access-t6drk\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.952806 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-reloader\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.953032 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-metrics\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: E0216 21:53:51.953093 4777 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.953144 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-reloader\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: E0216 21:53:51.953168 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/203c943d-2dd7-4534-a22d-9954b052748f-metrics-certs podName:203c943d-2dd7-4534-a22d-9954b052748f nodeName:}" failed. No retries permitted until 2026-02-16 21:53:52.45314589 +0000 UTC m=+953.035646992 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/203c943d-2dd7-4534-a22d-9954b052748f-metrics-certs") pod "frr-k8s-pq5bx" (UID: "203c943d-2dd7-4534-a22d-9954b052748f") : secret "frr-k8s-certs-secret" not found Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.953346 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-frr-conf\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.953371 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/203c943d-2dd7-4534-a22d-9954b052748f-frr-sockets\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.953849 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/203c943d-2dd7-4534-a22d-9954b052748f-frr-startup\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.957801 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a4b98ef-eeb8-4840-85e3-4b9ac5040e27-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-wb5ft\" (UID: \"3a4b98ef-eeb8-4840-85e3-4b9ac5040e27\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.971029 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmddj\" (UniqueName: \"kubernetes.io/projected/3a4b98ef-eeb8-4840-85e3-4b9ac5040e27-kube-api-access-mmddj\") pod \"frr-k8s-webhook-server-78b44bf5bb-wb5ft\" 
(UID: \"3a4b98ef-eeb8-4840-85e3-4b9ac5040e27\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" Feb 16 21:53:51 crc kubenswrapper[4777]: I0216 21:53:51.979394 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlk6t\" (UniqueName: \"kubernetes.io/projected/203c943d-2dd7-4534-a22d-9954b052748f-kube-api-access-nlk6t\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.053822 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6drk\" (UniqueName: \"kubernetes.io/projected/3e67f384-5475-4237-bc1f-b7781bd0c8eb-kube-api-access-t6drk\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.053878 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-memberlist\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.053905 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3e67f384-5475-4237-bc1f-b7781bd0c8eb-metallb-excludel2\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.053932 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-metrics-certs\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 
21:53:52.053971 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c25c0bd-85a5-4a21-9881-81663f96e5c8-cert\") pod \"controller-69bbfbf88f-q4xqv\" (UID: \"3c25c0bd-85a5-4a21-9881-81663f96e5c8\") " pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.054009 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c25c0bd-85a5-4a21-9881-81663f96e5c8-metrics-certs\") pod \"controller-69bbfbf88f-q4xqv\" (UID: \"3c25c0bd-85a5-4a21-9881-81663f96e5c8\") " pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.054030 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbz7\" (UniqueName: \"kubernetes.io/projected/3c25c0bd-85a5-4a21-9881-81663f96e5c8-kube-api-access-5qbz7\") pod \"controller-69bbfbf88f-q4xqv\" (UID: \"3c25c0bd-85a5-4a21-9881-81663f96e5c8\") " pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:52 crc kubenswrapper[4777]: E0216 21:53:52.054046 4777 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 16 21:53:52 crc kubenswrapper[4777]: E0216 21:53:52.054119 4777 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 16 21:53:52 crc kubenswrapper[4777]: E0216 21:53:52.054126 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-memberlist podName:3e67f384-5475-4237-bc1f-b7781bd0c8eb nodeName:}" failed. No retries permitted until 2026-02-16 21:53:52.554103325 +0000 UTC m=+953.136604437 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-memberlist") pod "speaker-q8kgx" (UID: "3e67f384-5475-4237-bc1f-b7781bd0c8eb") : secret "metallb-memberlist" not found Feb 16 21:53:52 crc kubenswrapper[4777]: E0216 21:53:52.054171 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c25c0bd-85a5-4a21-9881-81663f96e5c8-metrics-certs podName:3c25c0bd-85a5-4a21-9881-81663f96e5c8 nodeName:}" failed. No retries permitted until 2026-02-16 21:53:52.554159987 +0000 UTC m=+953.136661089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3c25c0bd-85a5-4a21-9881-81663f96e5c8-metrics-certs") pod "controller-69bbfbf88f-q4xqv" (UID: "3c25c0bd-85a5-4a21-9881-81663f96e5c8") : secret "controller-certs-secret" not found Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.054758 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3e67f384-5475-4237-bc1f-b7781bd0c8eb-metallb-excludel2\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.060283 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c25c0bd-85a5-4a21-9881-81663f96e5c8-cert\") pod \"controller-69bbfbf88f-q4xqv\" (UID: \"3c25c0bd-85a5-4a21-9881-81663f96e5c8\") " pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.061104 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-metrics-certs\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:52 crc 
kubenswrapper[4777]: I0216 21:53:52.074247 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6drk\" (UniqueName: \"kubernetes.io/projected/3e67f384-5475-4237-bc1f-b7781bd0c8eb-kube-api-access-t6drk\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.074628 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.084812 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbz7\" (UniqueName: \"kubernetes.io/projected/3c25c0bd-85a5-4a21-9881-81663f96e5c8-kube-api-access-5qbz7\") pod \"controller-69bbfbf88f-q4xqv\" (UID: \"3c25c0bd-85a5-4a21-9881-81663f96e5c8\") " pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.459905 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/203c943d-2dd7-4534-a22d-9954b052748f-metrics-certs\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.464258 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/203c943d-2dd7-4534-a22d-9954b052748f-metrics-certs\") pod \"frr-k8s-pq5bx\" (UID: \"203c943d-2dd7-4534-a22d-9954b052748f\") " pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.488329 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft"] Feb 16 21:53:52 crc kubenswrapper[4777]: W0216 21:53:52.495801 4777 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a4b98ef_eeb8_4840_85e3_4b9ac5040e27.slice/crio-182c8ee00ec2eeff260bfd06bf8ecbc23795cd58b243797ce9cd3a1d43fb406e WatchSource:0}: Error finding container 182c8ee00ec2eeff260bfd06bf8ecbc23795cd58b243797ce9cd3a1d43fb406e: Status 404 returned error can't find the container with id 182c8ee00ec2eeff260bfd06bf8ecbc23795cd58b243797ce9cd3a1d43fb406e Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.562100 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c25c0bd-85a5-4a21-9881-81663f96e5c8-metrics-certs\") pod \"controller-69bbfbf88f-q4xqv\" (UID: \"3c25c0bd-85a5-4a21-9881-81663f96e5c8\") " pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.562574 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-memberlist\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:52 crc kubenswrapper[4777]: E0216 21:53:52.562786 4777 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 16 21:53:52 crc kubenswrapper[4777]: E0216 21:53:52.562909 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-memberlist podName:3e67f384-5475-4237-bc1f-b7781bd0c8eb nodeName:}" failed. No retries permitted until 2026-02-16 21:53:53.562878623 +0000 UTC m=+954.145379755 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-memberlist") pod "speaker-q8kgx" (UID: "3e67f384-5475-4237-bc1f-b7781bd0c8eb") : secret "metallb-memberlist" not found Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.566325 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c25c0bd-85a5-4a21-9881-81663f96e5c8-metrics-certs\") pod \"controller-69bbfbf88f-q4xqv\" (UID: \"3c25c0bd-85a5-4a21-9881-81663f96e5c8\") " pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.671565 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" event={"ID":"3a4b98ef-eeb8-4840-85e3-4b9ac5040e27","Type":"ContainerStarted","Data":"182c8ee00ec2eeff260bfd06bf8ecbc23795cd58b243797ce9cd3a1d43fb406e"} Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.681079 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pq5bx" Feb 16 21:53:52 crc kubenswrapper[4777]: I0216 21:53:52.796363 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:53 crc kubenswrapper[4777]: I0216 21:53:53.071056 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-q4xqv"] Feb 16 21:53:53 crc kubenswrapper[4777]: W0216 21:53:53.081855 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c25c0bd_85a5_4a21_9881_81663f96e5c8.slice/crio-9d92d74d570e664f3732b0d65a23795637bf59eb28c94e3c12bebf40e3965e08 WatchSource:0}: Error finding container 9d92d74d570e664f3732b0d65a23795637bf59eb28c94e3c12bebf40e3965e08: Status 404 returned error can't find the container with id 9d92d74d570e664f3732b0d65a23795637bf59eb28c94e3c12bebf40e3965e08 Feb 16 21:53:53 crc kubenswrapper[4777]: I0216 21:53:53.577618 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-memberlist\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:53 crc kubenswrapper[4777]: I0216 21:53:53.584796 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3e67f384-5475-4237-bc1f-b7781bd0c8eb-memberlist\") pod \"speaker-q8kgx\" (UID: \"3e67f384-5475-4237-bc1f-b7781bd0c8eb\") " pod="metallb-system/speaker-q8kgx" Feb 16 21:53:53 crc kubenswrapper[4777]: I0216 21:53:53.680759 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pq5bx" event={"ID":"203c943d-2dd7-4534-a22d-9954b052748f","Type":"ContainerStarted","Data":"5af94d2f54a6eebc058ceffd4a40895e433f61f83a4dfec969acd560c3744ec0"} Feb 16 21:53:53 crc kubenswrapper[4777]: I0216 21:53:53.683138 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-q4xqv" 
event={"ID":"3c25c0bd-85a5-4a21-9881-81663f96e5c8","Type":"ContainerStarted","Data":"ab1ed27deeeaa0f595f1d3134d2851ffadf7f8ffbd46c6f63b2c1e7276476f12"} Feb 16 21:53:53 crc kubenswrapper[4777]: I0216 21:53:53.683174 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-q4xqv" event={"ID":"3c25c0bd-85a5-4a21-9881-81663f96e5c8","Type":"ContainerStarted","Data":"03482e8a58db42f052fc20e047942ae905b935f9d5a28f6c19026860f0de0228"} Feb 16 21:53:53 crc kubenswrapper[4777]: I0216 21:53:53.683187 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-q4xqv" event={"ID":"3c25c0bd-85a5-4a21-9881-81663f96e5c8","Type":"ContainerStarted","Data":"9d92d74d570e664f3732b0d65a23795637bf59eb28c94e3c12bebf40e3965e08"} Feb 16 21:53:53 crc kubenswrapper[4777]: I0216 21:53:53.683492 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-q4xqv" Feb 16 21:53:53 crc kubenswrapper[4777]: I0216 21:53:53.684487 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-q8kgx" Feb 16 21:53:53 crc kubenswrapper[4777]: W0216 21:53:53.712691 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e67f384_5475_4237_bc1f_b7781bd0c8eb.slice/crio-f6156e2a84b74ef17810665a92eb03daac4df2e2f5dd620ab7673a347f4adcd3 WatchSource:0}: Error finding container f6156e2a84b74ef17810665a92eb03daac4df2e2f5dd620ab7673a347f4adcd3: Status 404 returned error can't find the container with id f6156e2a84b74ef17810665a92eb03daac4df2e2f5dd620ab7673a347f4adcd3 Feb 16 21:53:53 crc kubenswrapper[4777]: I0216 21:53:53.713572 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-q4xqv" podStartSLOduration=2.713538402 podStartE2EDuration="2.713538402s" podCreationTimestamp="2026-02-16 21:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:53:53.710113136 +0000 UTC m=+954.292614288" watchObservedRunningTime="2026-02-16 21:53:53.713538402 +0000 UTC m=+954.296039544" Feb 16 21:53:54 crc kubenswrapper[4777]: I0216 21:53:54.694531 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q8kgx" event={"ID":"3e67f384-5475-4237-bc1f-b7781bd0c8eb","Type":"ContainerStarted","Data":"6674e8dccbd294c1e886ebc56781b4072c468a3f86ce4f0105e176a5c5611261"} Feb 16 21:53:54 crc kubenswrapper[4777]: I0216 21:53:54.694840 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q8kgx" event={"ID":"3e67f384-5475-4237-bc1f-b7781bd0c8eb","Type":"ContainerStarted","Data":"2b39cbfb6dcae8a386e35b61c57148fe790ab4d828db3356b4e5c833a81eab83"} Feb 16 21:53:54 crc kubenswrapper[4777]: I0216 21:53:54.694853 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q8kgx" 
event={"ID":"3e67f384-5475-4237-bc1f-b7781bd0c8eb","Type":"ContainerStarted","Data":"f6156e2a84b74ef17810665a92eb03daac4df2e2f5dd620ab7673a347f4adcd3"} Feb 16 21:53:54 crc kubenswrapper[4777]: I0216 21:53:54.695439 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-q8kgx" Feb 16 21:53:54 crc kubenswrapper[4777]: I0216 21:53:54.718203 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-q8kgx" podStartSLOduration=3.718179585 podStartE2EDuration="3.718179585s" podCreationTimestamp="2026-02-16 21:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:53:54.713484754 +0000 UTC m=+955.295985856" watchObservedRunningTime="2026-02-16 21:53:54.718179585 +0000 UTC m=+955.300680687" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.229951 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jmkn5"] Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.233830 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.248777 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jmkn5"] Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.332223 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvgnn\" (UniqueName: \"kubernetes.io/projected/3453b023-837e-492c-a4ed-2cc9c01904aa-kube-api-access-zvgnn\") pod \"community-operators-jmkn5\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") " pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.332488 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-catalog-content\") pod \"community-operators-jmkn5\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") " pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.332621 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-utilities\") pod \"community-operators-jmkn5\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") " pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.434193 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvgnn\" (UniqueName: \"kubernetes.io/projected/3453b023-837e-492c-a4ed-2cc9c01904aa-kube-api-access-zvgnn\") pod \"community-operators-jmkn5\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") " pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.434280 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-catalog-content\") pod \"community-operators-jmkn5\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") " pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.434337 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-utilities\") pod \"community-operators-jmkn5\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") " pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.437341 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-utilities\") pod \"community-operators-jmkn5\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") " pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.438086 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-catalog-content\") pod \"community-operators-jmkn5\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") " pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.459089 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvgnn\" (UniqueName: \"kubernetes.io/projected/3453b023-837e-492c-a4ed-2cc9c01904aa-kube-api-access-zvgnn\") pod \"community-operators-jmkn5\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") " pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.555018 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.617346 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.620756 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:57 crc kubenswrapper[4777]: I0216 21:53:57.734799 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:58 crc kubenswrapper[4777]: I0216 21:53:58.770082 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fd2pn" Feb 16 21:53:59 crc kubenswrapper[4777]: I0216 21:53:59.616382 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fd2pn"] Feb 16 21:54:00 crc kubenswrapper[4777]: I0216 21:54:00.451774 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jmkn5"] Feb 16 21:54:00 crc kubenswrapper[4777]: W0216 21:54:00.462759 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3453b023_837e_492c_a4ed_2cc9c01904aa.slice/crio-95370d684fe37faa9483e7d9fe8f9d7e098ac8f285c39c81999ccde8849d2aa6 WatchSource:0}: Error finding container 95370d684fe37faa9483e7d9fe8f9d7e098ac8f285c39c81999ccde8849d2aa6: Status 404 returned error can't find the container with id 95370d684fe37faa9483e7d9fe8f9d7e098ac8f285c39c81999ccde8849d2aa6 Feb 16 21:54:00 crc kubenswrapper[4777]: I0216 21:54:00.734908 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmkn5" 
event={"ID":"3453b023-837e-492c-a4ed-2cc9c01904aa","Type":"ContainerStarted","Data":"ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80"} Feb 16 21:54:00 crc kubenswrapper[4777]: I0216 21:54:00.735761 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmkn5" event={"ID":"3453b023-837e-492c-a4ed-2cc9c01904aa","Type":"ContainerStarted","Data":"95370d684fe37faa9483e7d9fe8f9d7e098ac8f285c39c81999ccde8849d2aa6"} Feb 16 21:54:00 crc kubenswrapper[4777]: I0216 21:54:00.737456 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" event={"ID":"3a4b98ef-eeb8-4840-85e3-4b9ac5040e27","Type":"ContainerStarted","Data":"3753ddff5b0ba458128160500501e5b84e6174e848c1b9d0462381038045330d"} Feb 16 21:54:00 crc kubenswrapper[4777]: I0216 21:54:00.737928 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" Feb 16 21:54:00 crc kubenswrapper[4777]: I0216 21:54:00.739239 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pq5bx" event={"ID":"203c943d-2dd7-4534-a22d-9954b052748f","Type":"ContainerStarted","Data":"cbb24b60135c3400ea192400f6fda59029caa33224bb9e00090a41bfb46973eb"} Feb 16 21:54:00 crc kubenswrapper[4777]: I0216 21:54:00.739405 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fd2pn" podUID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" containerName="registry-server" containerID="cri-o://9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc" gracePeriod=2 Feb 16 21:54:00 crc kubenswrapper[4777]: I0216 21:54:00.762826 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft" podStartSLOduration=1.991173727 podStartE2EDuration="9.762794755s" podCreationTimestamp="2026-02-16 21:53:51 +0000 UTC" 
firstStartedPulling="2026-02-16 21:53:52.497801842 +0000 UTC m=+953.080302944" lastFinishedPulling="2026-02-16 21:54:00.26942287 +0000 UTC m=+960.851923972" observedRunningTime="2026-02-16 21:54:00.754497233 +0000 UTC m=+961.336998335" watchObservedRunningTime="2026-02-16 21:54:00.762794755 +0000 UTC m=+961.345295867"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.702247 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fd2pn"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.747132 4777 generic.go:334] "Generic (PLEG): container finished" podID="203c943d-2dd7-4534-a22d-9954b052748f" containerID="cbb24b60135c3400ea192400f6fda59029caa33224bb9e00090a41bfb46973eb" exitCode=0
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.747205 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pq5bx" event={"ID":"203c943d-2dd7-4534-a22d-9954b052748f","Type":"ContainerDied","Data":"cbb24b60135c3400ea192400f6fda59029caa33224bb9e00090a41bfb46973eb"}
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.750742 4777 generic.go:334] "Generic (PLEG): container finished" podID="3453b023-837e-492c-a4ed-2cc9c01904aa" containerID="ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80" exitCode=0
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.750807 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmkn5" event={"ID":"3453b023-837e-492c-a4ed-2cc9c01904aa","Type":"ContainerDied","Data":"ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80"}
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.759075 4777 generic.go:334] "Generic (PLEG): container finished" podID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" containerID="9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc" exitCode=0
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.759151 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fd2pn"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.759196 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd2pn" event={"ID":"63dd1e04-3f84-4c99-b1f2-84507b807f8e","Type":"ContainerDied","Data":"9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc"}
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.759224 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fd2pn" event={"ID":"63dd1e04-3f84-4c99-b1f2-84507b807f8e","Type":"ContainerDied","Data":"526482225a8b206ecd7c75947735872d9a48c218e3158289e864b3a3acb6f8d7"}
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.759242 4777 scope.go:117] "RemoveContainer" containerID="9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.788924 4777 scope.go:117] "RemoveContainer" containerID="41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.814810 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptzkz\" (UniqueName: \"kubernetes.io/projected/63dd1e04-3f84-4c99-b1f2-84507b807f8e-kube-api-access-ptzkz\") pod \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") "
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.815596 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-catalog-content\") pod \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") "
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.815981 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-utilities\") pod \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\" (UID: \"63dd1e04-3f84-4c99-b1f2-84507b807f8e\") "
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.816959 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-utilities" (OuterVolumeSpecName: "utilities") pod "63dd1e04-3f84-4c99-b1f2-84507b807f8e" (UID: "63dd1e04-3f84-4c99-b1f2-84507b807f8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.817666 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-utilities\") on node \"crc\" DevicePath \"\""
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.828315 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63dd1e04-3f84-4c99-b1f2-84507b807f8e-kube-api-access-ptzkz" (OuterVolumeSpecName: "kube-api-access-ptzkz") pod "63dd1e04-3f84-4c99-b1f2-84507b807f8e" (UID: "63dd1e04-3f84-4c99-b1f2-84507b807f8e"). InnerVolumeSpecName "kube-api-access-ptzkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.835456 4777 scope.go:117] "RemoveContainer" containerID="bbbbb70a6fa3a0c2ce744a4d8ab944d402350a44f3c38205e2babdf38a2e950a"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.869564 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63dd1e04-3f84-4c99-b1f2-84507b807f8e" (UID: "63dd1e04-3f84-4c99-b1f2-84507b807f8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.878223 4777 scope.go:117] "RemoveContainer" containerID="9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc"
Feb 16 21:54:01 crc kubenswrapper[4777]: E0216 21:54:01.879502 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc\": container with ID starting with 9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc not found: ID does not exist" containerID="9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.879547 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc"} err="failed to get container status \"9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc\": rpc error: code = NotFound desc = could not find container \"9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc\": container with ID starting with 9d9ac00752df62cdd5ac5a9e0f5aece3d27d479050109c1b857ae758b1646acc not found: ID does not exist"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.879655 4777 scope.go:117] "RemoveContainer" containerID="41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08"
Feb 16 21:54:01 crc kubenswrapper[4777]: E0216 21:54:01.880164 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08\": container with ID starting with 41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08 not found: ID does not exist" containerID="41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.880224 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08"} err="failed to get container status \"41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08\": rpc error: code = NotFound desc = could not find container \"41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08\": container with ID starting with 41086e247679dfe3901d781fc97111755454cfee070ee5038150552e1c4c2d08 not found: ID does not exist"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.880271 4777 scope.go:117] "RemoveContainer" containerID="bbbbb70a6fa3a0c2ce744a4d8ab944d402350a44f3c38205e2babdf38a2e950a"
Feb 16 21:54:01 crc kubenswrapper[4777]: E0216 21:54:01.880624 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbbbb70a6fa3a0c2ce744a4d8ab944d402350a44f3c38205e2babdf38a2e950a\": container with ID starting with bbbbb70a6fa3a0c2ce744a4d8ab944d402350a44f3c38205e2babdf38a2e950a not found: ID does not exist" containerID="bbbbb70a6fa3a0c2ce744a4d8ab944d402350a44f3c38205e2babdf38a2e950a"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.880646 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbbb70a6fa3a0c2ce744a4d8ab944d402350a44f3c38205e2babdf38a2e950a"} err="failed to get container status \"bbbbb70a6fa3a0c2ce744a4d8ab944d402350a44f3c38205e2babdf38a2e950a\": rpc error: code = NotFound desc = could not find container \"bbbbb70a6fa3a0c2ce744a4d8ab944d402350a44f3c38205e2babdf38a2e950a\": container with ID starting with bbbbb70a6fa3a0c2ce744a4d8ab944d402350a44f3c38205e2babdf38a2e950a not found: ID does not exist"
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.919190 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dd1e04-3f84-4c99-b1f2-84507b807f8e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 21:54:01 crc kubenswrapper[4777]: I0216 21:54:01.919223 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptzkz\" (UniqueName: \"kubernetes.io/projected/63dd1e04-3f84-4c99-b1f2-84507b807f8e-kube-api-access-ptzkz\") on node \"crc\" DevicePath \"\""
Feb 16 21:54:02 crc kubenswrapper[4777]: I0216 21:54:02.099934 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fd2pn"]
Feb 16 21:54:02 crc kubenswrapper[4777]: I0216 21:54:02.105625 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fd2pn"]
Feb 16 21:54:02 crc kubenswrapper[4777]: I0216 21:54:02.197041 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" path="/var/lib/kubelet/pods/63dd1e04-3f84-4c99-b1f2-84507b807f8e/volumes"
Feb 16 21:54:02 crc kubenswrapper[4777]: I0216 21:54:02.773080 4777 generic.go:334] "Generic (PLEG): container finished" podID="3453b023-837e-492c-a4ed-2cc9c01904aa" containerID="21c22b4b9d25730ef46e70d197e1a38d6c92443c655161054356f9676ca8f8fa" exitCode=0
Feb 16 21:54:02 crc kubenswrapper[4777]: I0216 21:54:02.773138 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmkn5" event={"ID":"3453b023-837e-492c-a4ed-2cc9c01904aa","Type":"ContainerDied","Data":"21c22b4b9d25730ef46e70d197e1a38d6c92443c655161054356f9676ca8f8fa"}
Feb 16 21:54:02 crc kubenswrapper[4777]: I0216 21:54:02.783925 4777 generic.go:334] "Generic (PLEG): container finished" podID="203c943d-2dd7-4534-a22d-9954b052748f" containerID="28877d419d0abc2ddafe360537aacd264044c4a386900822a661d4aae7a28685" exitCode=0
Feb 16 21:54:02 crc kubenswrapper[4777]: I0216 21:54:02.783977 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pq5bx" event={"ID":"203c943d-2dd7-4534-a22d-9954b052748f","Type":"ContainerDied","Data":"28877d419d0abc2ddafe360537aacd264044c4a386900822a661d4aae7a28685"}
Feb 16 21:54:03 crc kubenswrapper[4777]: I0216 21:54:03.792095 4777 generic.go:334] "Generic (PLEG): container finished" podID="203c943d-2dd7-4534-a22d-9954b052748f" containerID="e4ef6425eb400730fcf42767aaa460ac11cb89da6bf01831fd353f545a9b8404" exitCode=0
Feb 16 21:54:03 crc kubenswrapper[4777]: I0216 21:54:03.792152 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pq5bx" event={"ID":"203c943d-2dd7-4534-a22d-9954b052748f","Type":"ContainerDied","Data":"e4ef6425eb400730fcf42767aaa460ac11cb89da6bf01831fd353f545a9b8404"}
Feb 16 21:54:03 crc kubenswrapper[4777]: I0216 21:54:03.794498 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmkn5" event={"ID":"3453b023-837e-492c-a4ed-2cc9c01904aa","Type":"ContainerStarted","Data":"20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439"}
Feb 16 21:54:03 crc kubenswrapper[4777]: I0216 21:54:03.848753 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jmkn5" podStartSLOduration=5.322920774 podStartE2EDuration="6.848704241s" podCreationTimestamp="2026-02-16 21:53:57 +0000 UTC" firstStartedPulling="2026-02-16 21:54:01.757972334 +0000 UTC m=+962.340473436" lastFinishedPulling="2026-02-16 21:54:03.283755771 +0000 UTC m=+963.866256903" observedRunningTime="2026-02-16 21:54:03.847194358 +0000 UTC m=+964.429695460" watchObservedRunningTime="2026-02-16 21:54:03.848704241 +0000 UTC m=+964.431205363"
Feb 16 21:54:04 crc kubenswrapper[4777]: I0216 21:54:04.811238 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pq5bx" event={"ID":"203c943d-2dd7-4534-a22d-9954b052748f","Type":"ContainerStarted","Data":"8b53bbfe07149451996a644d5ca554abc69a2067f5056e601fabedeb77f3ae9a"}
Feb 16 21:54:04 crc kubenswrapper[4777]: I0216 21:54:04.811713 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pq5bx" event={"ID":"203c943d-2dd7-4534-a22d-9954b052748f","Type":"ContainerStarted","Data":"d5de379215b8acb758364307c5009b789a5110a1946c7bb3956bb948c252fd27"}
Feb 16 21:54:04 crc kubenswrapper[4777]: I0216 21:54:04.811742 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pq5bx" event={"ID":"203c943d-2dd7-4534-a22d-9954b052748f","Type":"ContainerStarted","Data":"2d341e669a70a22e1eb69fc70a00d358d299383b19dcf4b48976de5eb80dddf4"}
Feb 16 21:54:04 crc kubenswrapper[4777]: I0216 21:54:04.811753 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pq5bx" event={"ID":"203c943d-2dd7-4534-a22d-9954b052748f","Type":"ContainerStarted","Data":"4a2ed438bac659add10e3a05510f3e2c6dba12b41ab6ffccebd80758f95810f8"}
Feb 16 21:54:04 crc kubenswrapper[4777]: I0216 21:54:04.811764 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pq5bx" event={"ID":"203c943d-2dd7-4534-a22d-9954b052748f","Type":"ContainerStarted","Data":"b61969bd93af398925c859f437f55916e9066ddd64ef9f9a12a65c2fc085e1df"}
Feb 16 21:54:05 crc kubenswrapper[4777]: I0216 21:54:05.835795 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pq5bx" event={"ID":"203c943d-2dd7-4534-a22d-9954b052748f","Type":"ContainerStarted","Data":"555fcdaae45b60be9bc7184a91ded7494d7ca6a6bc6e6286910c2e7041b5abfe"}
Feb 16 21:54:05 crc kubenswrapper[4777]: I0216 21:54:05.836180 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pq5bx"
Feb 16 21:54:05 crc kubenswrapper[4777]: I0216 21:54:05.873381 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-pq5bx" podStartSLOduration=7.406082087 podStartE2EDuration="14.873343227s" podCreationTimestamp="2026-02-16 21:53:51 +0000 UTC" firstStartedPulling="2026-02-16 21:53:52.807982762 +0000 UTC m=+953.390483864" lastFinishedPulling="2026-02-16 21:54:00.275243902 +0000 UTC m=+960.857745004" observedRunningTime="2026-02-16 21:54:05.867467502 +0000 UTC m=+966.449968714" watchObservedRunningTime="2026-02-16 21:54:05.873343227 +0000 UTC m=+966.455844379"
Feb 16 21:54:07 crc kubenswrapper[4777]: I0216 21:54:07.555187 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jmkn5"
Feb 16 21:54:07 crc kubenswrapper[4777]: I0216 21:54:07.555298 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jmkn5"
Feb 16 21:54:07 crc kubenswrapper[4777]: I0216 21:54:07.639868 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jmkn5"
Feb 16 21:54:07 crc kubenswrapper[4777]: I0216 21:54:07.681634 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pq5bx"
Feb 16 21:54:07 crc kubenswrapper[4777]: I0216 21:54:07.758423 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pq5bx"
Feb 16 21:54:11 crc kubenswrapper[4777]: I0216 21:54:11.651838 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 21:54:11 crc kubenswrapper[4777]: I0216 21:54:11.652967 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 21:54:11 crc kubenswrapper[4777]: I0216 21:54:11.653078 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj"
Feb 16 21:54:11 crc kubenswrapper[4777]: I0216 21:54:11.654087 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7bcaca13099ae0d1981555aa9ad645633be960905be53ca0bdd76ed67ab8a1a2"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 16 21:54:11 crc kubenswrapper[4777]: I0216 21:54:11.654206 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://7bcaca13099ae0d1981555aa9ad645633be960905be53ca0bdd76ed67ab8a1a2" gracePeriod=600
Feb 16 21:54:11 crc kubenswrapper[4777]: I0216 21:54:11.895677 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="7bcaca13099ae0d1981555aa9ad645633be960905be53ca0bdd76ed67ab8a1a2" exitCode=0
Feb 16 21:54:11 crc kubenswrapper[4777]: I0216 21:54:11.895764 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"7bcaca13099ae0d1981555aa9ad645633be960905be53ca0bdd76ed67ab8a1a2"}
Feb 16 21:54:11 crc kubenswrapper[4777]: I0216 21:54:11.896148 4777 scope.go:117] "RemoveContainer" containerID="a39dbf7f579cf49a749edb90301d2c78d17aa9cbdd7968d6c7cf0fe290e75857"
Feb 16 21:54:12 crc kubenswrapper[4777]: I0216 21:54:12.087139 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-wb5ft"
Feb 16 21:54:12 crc kubenswrapper[4777]: I0216 21:54:12.802507 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-q4xqv"
Feb 16 21:54:12 crc kubenswrapper[4777]: I0216 21:54:12.906661 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"09c5b65387f90ae361507692bc923fcadef41388ff0532f5cec9f60ce0409e3f"}
Feb 16 21:54:13 crc kubenswrapper[4777]: I0216 21:54:13.689647 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-q8kgx"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.639591 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9r7v5"]
Feb 16 21:54:16 crc kubenswrapper[4777]: E0216 21:54:16.641007 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" containerName="extract-utilities"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.641044 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" containerName="extract-utilities"
Feb 16 21:54:16 crc kubenswrapper[4777]: E0216 21:54:16.641076 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" containerName="extract-content"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.641093 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" containerName="extract-content"
Feb 16 21:54:16 crc kubenswrapper[4777]: E0216 21:54:16.641114 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" containerName="registry-server"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.641132 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" containerName="registry-server"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.641481 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="63dd1e04-3f84-4c99-b1f2-84507b807f8e" containerName="registry-server"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.642450 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9r7v5"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.647299 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2wksg"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.647339 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.647416 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.664897 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9r7v5"]
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.753540 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vffg5\" (UniqueName: \"kubernetes.io/projected/092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d-kube-api-access-vffg5\") pod \"openstack-operator-index-9r7v5\" (UID: \"092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d\") " pod="openstack-operators/openstack-operator-index-9r7v5"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.854737 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vffg5\" (UniqueName: \"kubernetes.io/projected/092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d-kube-api-access-vffg5\") pod \"openstack-operator-index-9r7v5\" (UID: \"092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d\") " pod="openstack-operators/openstack-operator-index-9r7v5"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.878439 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vffg5\" (UniqueName: \"kubernetes.io/projected/092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d-kube-api-access-vffg5\") pod \"openstack-operator-index-9r7v5\" (UID: \"092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d\") " pod="openstack-operators/openstack-operator-index-9r7v5"
Feb 16 21:54:16 crc kubenswrapper[4777]: I0216 21:54:16.977131 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9r7v5"
Feb 16 21:54:17 crc kubenswrapper[4777]: I0216 21:54:17.471938 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9r7v5"]
Feb 16 21:54:17 crc kubenswrapper[4777]: I0216 21:54:17.611764 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jmkn5"
Feb 16 21:54:17 crc kubenswrapper[4777]: I0216 21:54:17.943842 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9r7v5" event={"ID":"092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d","Type":"ContainerStarted","Data":"7ddc20d120ec44c06b458ea13e79c9e934e67b7870304bb5c245b895e9286eda"}
Feb 16 21:54:20 crc kubenswrapper[4777]: I0216 21:54:20.799189 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9r7v5"]
Feb 16 21:54:20 crc kubenswrapper[4777]: I0216 21:54:20.971773 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9r7v5" event={"ID":"092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d","Type":"ContainerStarted","Data":"9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a"}
Feb 16 21:54:21 crc kubenswrapper[4777]: I0216 21:54:21.002705 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9r7v5" podStartSLOduration=2.31221175 podStartE2EDuration="5.002676519s" podCreationTimestamp="2026-02-16 21:54:16 +0000 UTC" firstStartedPulling="2026-02-16 21:54:17.471091122 +0000 UTC m=+978.053592254" lastFinishedPulling="2026-02-16 21:54:20.161555911 +0000 UTC m=+980.744057023" observedRunningTime="2026-02-16 21:54:20.996500026 +0000 UTC m=+981.579001178" watchObservedRunningTime="2026-02-16 21:54:21.002676519 +0000 UTC m=+981.585177661"
Feb 16 21:54:21 crc kubenswrapper[4777]: I0216 21:54:21.612059 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ktl5l"]
Feb 16 21:54:21 crc kubenswrapper[4777]: I0216 21:54:21.615094 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ktl5l"
Feb 16 21:54:21 crc kubenswrapper[4777]: I0216 21:54:21.646346 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ktl5l"]
Feb 16 21:54:21 crc kubenswrapper[4777]: I0216 21:54:21.734533 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t86n8\" (UniqueName: \"kubernetes.io/projected/5250d17d-4d3e-4b18-a738-04ec8120dfbb-kube-api-access-t86n8\") pod \"openstack-operator-index-ktl5l\" (UID: \"5250d17d-4d3e-4b18-a738-04ec8120dfbb\") " pod="openstack-operators/openstack-operator-index-ktl5l"
Feb 16 21:54:21 crc kubenswrapper[4777]: I0216 21:54:21.835836 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t86n8\" (UniqueName: \"kubernetes.io/projected/5250d17d-4d3e-4b18-a738-04ec8120dfbb-kube-api-access-t86n8\") pod \"openstack-operator-index-ktl5l\" (UID: \"5250d17d-4d3e-4b18-a738-04ec8120dfbb\") " pod="openstack-operators/openstack-operator-index-ktl5l"
Feb 16 21:54:21 crc kubenswrapper[4777]: I0216 21:54:21.862267 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t86n8\" (UniqueName: \"kubernetes.io/projected/5250d17d-4d3e-4b18-a738-04ec8120dfbb-kube-api-access-t86n8\") pod \"openstack-operator-index-ktl5l\" (UID: \"5250d17d-4d3e-4b18-a738-04ec8120dfbb\") " pod="openstack-operators/openstack-operator-index-ktl5l"
Feb 16 21:54:21 crc kubenswrapper[4777]: I0216 21:54:21.954578 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ktl5l"
Feb 16 21:54:21 crc kubenswrapper[4777]: I0216 21:54:21.980984 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-9r7v5" podUID="092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d" containerName="registry-server" containerID="cri-o://9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a" gracePeriod=2
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.345181 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9r7v5"
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.387235 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ktl5l"]
Feb 16 21:54:22 crc kubenswrapper[4777]: W0216 21:54:22.397991 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5250d17d_4d3e_4b18_a738_04ec8120dfbb.slice/crio-907fd21666ad4e7b8a6e6efb789c9df96efc87875fc9453216b327edb9203be4 WatchSource:0}: Error finding container 907fd21666ad4e7b8a6e6efb789c9df96efc87875fc9453216b327edb9203be4: Status 404 returned error can't find the container with id 907fd21666ad4e7b8a6e6efb789c9df96efc87875fc9453216b327edb9203be4
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.444098 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vffg5\" (UniqueName: \"kubernetes.io/projected/092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d-kube-api-access-vffg5\") pod \"092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d\" (UID: \"092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d\") "
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.452130 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d-kube-api-access-vffg5" (OuterVolumeSpecName: "kube-api-access-vffg5") pod "092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d" (UID: "092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d"). InnerVolumeSpecName "kube-api-access-vffg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.545781 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vffg5\" (UniqueName: \"kubernetes.io/projected/092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d-kube-api-access-vffg5\") on node \"crc\" DevicePath \"\""
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.687041 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pq5bx"
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.991634 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ktl5l" event={"ID":"5250d17d-4d3e-4b18-a738-04ec8120dfbb","Type":"ContainerStarted","Data":"2aefcb68c90cb8ced6b35d3d461b61b2fdda16732180dc27010a32d617e74445"}
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.991691 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ktl5l" event={"ID":"5250d17d-4d3e-4b18-a738-04ec8120dfbb","Type":"ContainerStarted","Data":"907fd21666ad4e7b8a6e6efb789c9df96efc87875fc9453216b327edb9203be4"}
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.995796 4777 generic.go:334] "Generic (PLEG): container finished" podID="092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d" containerID="9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a" exitCode=0
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.995846 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9r7v5"
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.995852 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9r7v5" event={"ID":"092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d","Type":"ContainerDied","Data":"9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a"}
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.995888 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9r7v5" event={"ID":"092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d","Type":"ContainerDied","Data":"7ddc20d120ec44c06b458ea13e79c9e934e67b7870304bb5c245b895e9286eda"}
Feb 16 21:54:22 crc kubenswrapper[4777]: I0216 21:54:22.995912 4777 scope.go:117] "RemoveContainer" containerID="9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a"
Feb 16 21:54:23 crc kubenswrapper[4777]: I0216 21:54:23.023197 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ktl5l" podStartSLOduration=1.96174015 podStartE2EDuration="2.023172079s" podCreationTimestamp="2026-02-16 21:54:21 +0000 UTC" firstStartedPulling="2026-02-16 21:54:22.404319681 +0000 UTC m=+982.986820783" lastFinishedPulling="2026-02-16 21:54:22.46575161 +0000 UTC m=+983.048252712" observedRunningTime="2026-02-16 21:54:23.016347108 +0000 UTC m=+983.598848220" watchObservedRunningTime="2026-02-16 21:54:23.023172079 +0000 UTC m=+983.605673191"
Feb 16 21:54:23 crc kubenswrapper[4777]: I0216 21:54:23.032024 4777 scope.go:117] "RemoveContainer" containerID="9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a"
Feb 16 21:54:23 crc kubenswrapper[4777]: E0216 21:54:23.032955 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a\": container with ID starting with 9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a not found: ID does not exist" containerID="9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a"
Feb 16 21:54:23 crc kubenswrapper[4777]: I0216 21:54:23.033159 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a"} err="failed to get container status \"9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a\": rpc error: code = NotFound desc = could not find container \"9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a\": container with ID starting with 9cb861401bef4ea048c28ba3d4fb23a37fa2fc09a21d03cbb762b9cf0dba450a not found: ID does not exist"
Feb 16 21:54:23 crc kubenswrapper[4777]: I0216 21:54:23.042314 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9r7v5"]
Feb 16 21:54:23 crc kubenswrapper[4777]: I0216 21:54:23.048666 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-9r7v5"]
Feb 16 21:54:24 crc kubenswrapper[4777]: I0216 21:54:24.200119 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d" path="/var/lib/kubelet/pods/092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d/volumes"
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.208716 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jmkn5"]
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.209549 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jmkn5" podUID="3453b023-837e-492c-a4ed-2cc9c01904aa" containerName="registry-server" containerID="cri-o://20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439" gracePeriod=2
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.665322 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jmkn5"
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.803125 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-catalog-content\") pod \"3453b023-837e-492c-a4ed-2cc9c01904aa\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") "
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.803187 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-utilities\") pod \"3453b023-837e-492c-a4ed-2cc9c01904aa\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") "
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.803257 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvgnn\" (UniqueName: \"kubernetes.io/projected/3453b023-837e-492c-a4ed-2cc9c01904aa-kube-api-access-zvgnn\") pod \"3453b023-837e-492c-a4ed-2cc9c01904aa\" (UID: \"3453b023-837e-492c-a4ed-2cc9c01904aa\") "
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.804180 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-utilities" (OuterVolumeSpecName: "utilities") pod "3453b023-837e-492c-a4ed-2cc9c01904aa" (UID: "3453b023-837e-492c-a4ed-2cc9c01904aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.817442 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3453b023-837e-492c-a4ed-2cc9c01904aa-kube-api-access-zvgnn" (OuterVolumeSpecName: "kube-api-access-zvgnn") pod "3453b023-837e-492c-a4ed-2cc9c01904aa" (UID: "3453b023-837e-492c-a4ed-2cc9c01904aa"). InnerVolumeSpecName "kube-api-access-zvgnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.861958 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3453b023-837e-492c-a4ed-2cc9c01904aa" (UID: "3453b023-837e-492c-a4ed-2cc9c01904aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.904930 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.904967 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3453b023-837e-492c-a4ed-2cc9c01904aa-utilities\") on node \"crc\" DevicePath \"\""
Feb 16 21:54:25 crc kubenswrapper[4777]: I0216 21:54:25.904977 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvgnn\" (UniqueName: \"kubernetes.io/projected/3453b023-837e-492c-a4ed-2cc9c01904aa-kube-api-access-zvgnn\") on node \"crc\" DevicePath \"\""
Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.027029 4777 generic.go:334] "Generic (PLEG): container finished" podID="3453b023-837e-492c-a4ed-2cc9c01904aa" containerID="20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439" exitCode=0
Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.027084 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jmkn5" Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.027084 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmkn5" event={"ID":"3453b023-837e-492c-a4ed-2cc9c01904aa","Type":"ContainerDied","Data":"20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439"} Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.027199 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmkn5" event={"ID":"3453b023-837e-492c-a4ed-2cc9c01904aa","Type":"ContainerDied","Data":"95370d684fe37faa9483e7d9fe8f9d7e098ac8f285c39c81999ccde8849d2aa6"} Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.027224 4777 scope.go:117] "RemoveContainer" containerID="20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439" Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.048782 4777 scope.go:117] "RemoveContainer" containerID="21c22b4b9d25730ef46e70d197e1a38d6c92443c655161054356f9676ca8f8fa" Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.080700 4777 scope.go:117] "RemoveContainer" containerID="ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80" Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.083270 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jmkn5"] Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.089959 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jmkn5"] Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.113073 4777 scope.go:117] "RemoveContainer" containerID="20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439" Feb 16 21:54:26 crc kubenswrapper[4777]: E0216 21:54:26.113568 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439\": container with ID starting with 20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439 not found: ID does not exist" containerID="20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439" Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.113612 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439"} err="failed to get container status \"20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439\": rpc error: code = NotFound desc = could not find container \"20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439\": container with ID starting with 20b087213f24ffb5a14f6177e72b1da3538127bd3b85a5988d4f4c2ed27ea439 not found: ID does not exist" Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.113641 4777 scope.go:117] "RemoveContainer" containerID="21c22b4b9d25730ef46e70d197e1a38d6c92443c655161054356f9676ca8f8fa" Feb 16 21:54:26 crc kubenswrapper[4777]: E0216 21:54:26.113982 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c22b4b9d25730ef46e70d197e1a38d6c92443c655161054356f9676ca8f8fa\": container with ID starting with 21c22b4b9d25730ef46e70d197e1a38d6c92443c655161054356f9676ca8f8fa not found: ID does not exist" containerID="21c22b4b9d25730ef46e70d197e1a38d6c92443c655161054356f9676ca8f8fa" Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.114011 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c22b4b9d25730ef46e70d197e1a38d6c92443c655161054356f9676ca8f8fa"} err="failed to get container status \"21c22b4b9d25730ef46e70d197e1a38d6c92443c655161054356f9676ca8f8fa\": rpc error: code = NotFound desc = could not find container \"21c22b4b9d25730ef46e70d197e1a38d6c92443c655161054356f9676ca8f8fa\": container with ID 
starting with 21c22b4b9d25730ef46e70d197e1a38d6c92443c655161054356f9676ca8f8fa not found: ID does not exist" Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.114031 4777 scope.go:117] "RemoveContainer" containerID="ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80" Feb 16 21:54:26 crc kubenswrapper[4777]: E0216 21:54:26.114307 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80\": container with ID starting with ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80 not found: ID does not exist" containerID="ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80" Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.114333 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80"} err="failed to get container status \"ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80\": rpc error: code = NotFound desc = could not find container \"ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80\": container with ID starting with ee5572a7f5f82b4521c25602df1873d480ab3848d4044e02557106242a641d80 not found: ID does not exist" Feb 16 21:54:26 crc kubenswrapper[4777]: I0216 21:54:26.209839 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3453b023-837e-492c-a4ed-2cc9c01904aa" path="/var/lib/kubelet/pods/3453b023-837e-492c-a4ed-2cc9c01904aa/volumes" Feb 16 21:54:31 crc kubenswrapper[4777]: I0216 21:54:31.955561 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ktl5l" Feb 16 21:54:31 crc kubenswrapper[4777]: I0216 21:54:31.956139 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ktl5l" Feb 16 21:54:32 
crc kubenswrapper[4777]: I0216 21:54:32.003005 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ktl5l" Feb 16 21:54:32 crc kubenswrapper[4777]: I0216 21:54:32.395484 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ktl5l" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.303505 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5"] Feb 16 21:54:39 crc kubenswrapper[4777]: E0216 21:54:39.304826 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3453b023-837e-492c-a4ed-2cc9c01904aa" containerName="extract-content" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.304852 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="3453b023-837e-492c-a4ed-2cc9c01904aa" containerName="extract-content" Feb 16 21:54:39 crc kubenswrapper[4777]: E0216 21:54:39.304895 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d" containerName="registry-server" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.304909 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d" containerName="registry-server" Feb 16 21:54:39 crc kubenswrapper[4777]: E0216 21:54:39.304932 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3453b023-837e-492c-a4ed-2cc9c01904aa" containerName="extract-utilities" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.304949 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="3453b023-837e-492c-a4ed-2cc9c01904aa" containerName="extract-utilities" Feb 16 21:54:39 crc kubenswrapper[4777]: E0216 21:54:39.304975 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3453b023-837e-492c-a4ed-2cc9c01904aa" containerName="registry-server" Feb 16 21:54:39 crc 
kubenswrapper[4777]: I0216 21:54:39.304988 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="3453b023-837e-492c-a4ed-2cc9c01904aa" containerName="registry-server" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.305213 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="3453b023-837e-492c-a4ed-2cc9c01904aa" containerName="registry-server" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.305246 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="092f2af9-6eaf-4b0f-b96d-8cbfeb034d6d" containerName="registry-server" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.306834 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.316201 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-46bmq" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.319925 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5"] Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.414051 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-util\") pod \"02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5\" (UID: \"27319f1d-e680-40c0-a521-c2f388e8af6b\") " pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.414109 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cn26\" (UniqueName: \"kubernetes.io/projected/27319f1d-e680-40c0-a521-c2f388e8af6b-kube-api-access-7cn26\") pod 
\"02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5\" (UID: \"27319f1d-e680-40c0-a521-c2f388e8af6b\") " pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.414204 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-bundle\") pod \"02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5\" (UID: \"27319f1d-e680-40c0-a521-c2f388e8af6b\") " pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.515562 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cn26\" (UniqueName: \"kubernetes.io/projected/27319f1d-e680-40c0-a521-c2f388e8af6b-kube-api-access-7cn26\") pod \"02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5\" (UID: \"27319f1d-e680-40c0-a521-c2f388e8af6b\") " pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.515627 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-bundle\") pod \"02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5\" (UID: \"27319f1d-e680-40c0-a521-c2f388e8af6b\") " pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.515794 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-util\") pod \"02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5\" (UID: \"27319f1d-e680-40c0-a521-c2f388e8af6b\") " 
pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.516436 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-bundle\") pod \"02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5\" (UID: \"27319f1d-e680-40c0-a521-c2f388e8af6b\") " pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.516473 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-util\") pod \"02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5\" (UID: \"27319f1d-e680-40c0-a521-c2f388e8af6b\") " pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.539665 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cn26\" (UniqueName: \"kubernetes.io/projected/27319f1d-e680-40c0-a521-c2f388e8af6b-kube-api-access-7cn26\") pod \"02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5\" (UID: \"27319f1d-e680-40c0-a521-c2f388e8af6b\") " pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.635230 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:39 crc kubenswrapper[4777]: I0216 21:54:39.957266 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5"] Feb 16 21:54:40 crc kubenswrapper[4777]: I0216 21:54:40.423770 4777 generic.go:334] "Generic (PLEG): container finished" podID="27319f1d-e680-40c0-a521-c2f388e8af6b" containerID="5992f1e7023255241085385e7cde8af5bb7e6b5f59987bd51aa6a011601b706c" exitCode=0 Feb 16 21:54:40 crc kubenswrapper[4777]: I0216 21:54:40.423932 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" event={"ID":"27319f1d-e680-40c0-a521-c2f388e8af6b","Type":"ContainerDied","Data":"5992f1e7023255241085385e7cde8af5bb7e6b5f59987bd51aa6a011601b706c"} Feb 16 21:54:40 crc kubenswrapper[4777]: I0216 21:54:40.424182 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" event={"ID":"27319f1d-e680-40c0-a521-c2f388e8af6b","Type":"ContainerStarted","Data":"aacef733b0534217232a6ea8c2b7336813653ff73aec0baeb316433499993b10"} Feb 16 21:54:40 crc kubenswrapper[4777]: I0216 21:54:40.426505 4777 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 16 21:54:41 crc kubenswrapper[4777]: I0216 21:54:41.455401 4777 generic.go:334] "Generic (PLEG): container finished" podID="27319f1d-e680-40c0-a521-c2f388e8af6b" containerID="d971519254d6e02bc0a71b31ee1885ad0d2ccf59367d81eaa250529b3c3a5c4f" exitCode=0 Feb 16 21:54:41 crc kubenswrapper[4777]: I0216 21:54:41.455476 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" 
event={"ID":"27319f1d-e680-40c0-a521-c2f388e8af6b","Type":"ContainerDied","Data":"d971519254d6e02bc0a71b31ee1885ad0d2ccf59367d81eaa250529b3c3a5c4f"} Feb 16 21:54:42 crc kubenswrapper[4777]: I0216 21:54:42.468816 4777 generic.go:334] "Generic (PLEG): container finished" podID="27319f1d-e680-40c0-a521-c2f388e8af6b" containerID="c7bd8b1dbdbc64ec35a304ad89764be31ae2bcd89ac21162243752173e56b313" exitCode=0 Feb 16 21:54:42 crc kubenswrapper[4777]: I0216 21:54:42.469083 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" event={"ID":"27319f1d-e680-40c0-a521-c2f388e8af6b","Type":"ContainerDied","Data":"c7bd8b1dbdbc64ec35a304ad89764be31ae2bcd89ac21162243752173e56b313"} Feb 16 21:54:43 crc kubenswrapper[4777]: I0216 21:54:43.879665 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:43 crc kubenswrapper[4777]: I0216 21:54:43.985673 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cn26\" (UniqueName: \"kubernetes.io/projected/27319f1d-e680-40c0-a521-c2f388e8af6b-kube-api-access-7cn26\") pod \"27319f1d-e680-40c0-a521-c2f388e8af6b\" (UID: \"27319f1d-e680-40c0-a521-c2f388e8af6b\") " Feb 16 21:54:43 crc kubenswrapper[4777]: I0216 21:54:43.985844 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-bundle\") pod \"27319f1d-e680-40c0-a521-c2f388e8af6b\" (UID: \"27319f1d-e680-40c0-a521-c2f388e8af6b\") " Feb 16 21:54:43 crc kubenswrapper[4777]: I0216 21:54:43.986100 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-util\") pod \"27319f1d-e680-40c0-a521-c2f388e8af6b\" (UID: 
\"27319f1d-e680-40c0-a521-c2f388e8af6b\") " Feb 16 21:54:43 crc kubenswrapper[4777]: I0216 21:54:43.988181 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-bundle" (OuterVolumeSpecName: "bundle") pod "27319f1d-e680-40c0-a521-c2f388e8af6b" (UID: "27319f1d-e680-40c0-a521-c2f388e8af6b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:54:44 crc kubenswrapper[4777]: I0216 21:54:44.007303 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27319f1d-e680-40c0-a521-c2f388e8af6b-kube-api-access-7cn26" (OuterVolumeSpecName: "kube-api-access-7cn26") pod "27319f1d-e680-40c0-a521-c2f388e8af6b" (UID: "27319f1d-e680-40c0-a521-c2f388e8af6b"). InnerVolumeSpecName "kube-api-access-7cn26". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:54:44 crc kubenswrapper[4777]: I0216 21:54:44.046272 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-util" (OuterVolumeSpecName: "util") pod "27319f1d-e680-40c0-a521-c2f388e8af6b" (UID: "27319f1d-e680-40c0-a521-c2f388e8af6b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:54:44 crc kubenswrapper[4777]: I0216 21:54:44.088100 4777 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-util\") on node \"crc\" DevicePath \"\"" Feb 16 21:54:44 crc kubenswrapper[4777]: I0216 21:54:44.088136 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cn26\" (UniqueName: \"kubernetes.io/projected/27319f1d-e680-40c0-a521-c2f388e8af6b-kube-api-access-7cn26\") on node \"crc\" DevicePath \"\"" Feb 16 21:54:44 crc kubenswrapper[4777]: I0216 21:54:44.088152 4777 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27319f1d-e680-40c0-a521-c2f388e8af6b-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:54:44 crc kubenswrapper[4777]: I0216 21:54:44.492974 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" event={"ID":"27319f1d-e680-40c0-a521-c2f388e8af6b","Type":"ContainerDied","Data":"aacef733b0534217232a6ea8c2b7336813653ff73aec0baeb316433499993b10"} Feb 16 21:54:44 crc kubenswrapper[4777]: I0216 21:54:44.493039 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aacef733b0534217232a6ea8c2b7336813653ff73aec0baeb316433499993b10" Feb 16 21:54:44 crc kubenswrapper[4777]: I0216 21:54:44.493068 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.181905 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn"] Feb 16 21:54:47 crc kubenswrapper[4777]: E0216 21:54:47.182786 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27319f1d-e680-40c0-a521-c2f388e8af6b" containerName="extract" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.182810 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="27319f1d-e680-40c0-a521-c2f388e8af6b" containerName="extract" Feb 16 21:54:47 crc kubenswrapper[4777]: E0216 21:54:47.182845 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27319f1d-e680-40c0-a521-c2f388e8af6b" containerName="util" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.182859 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="27319f1d-e680-40c0-a521-c2f388e8af6b" containerName="util" Feb 16 21:54:47 crc kubenswrapper[4777]: E0216 21:54:47.182886 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27319f1d-e680-40c0-a521-c2f388e8af6b" containerName="pull" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.182899 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="27319f1d-e680-40c0-a521-c2f388e8af6b" containerName="pull" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.183151 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="27319f1d-e680-40c0-a521-c2f388e8af6b" containerName="extract" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.183883 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.190098 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-lzwgk" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.207980 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn"] Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.334260 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llpbs\" (UniqueName: \"kubernetes.io/projected/df21ebae-039c-40c9-9b98-2ecb22a01bdb-kube-api-access-llpbs\") pod \"openstack-operator-controller-init-7845fcf9cf-qf9zn\" (UID: \"df21ebae-039c-40c9-9b98-2ecb22a01bdb\") " pod="openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.435854 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llpbs\" (UniqueName: \"kubernetes.io/projected/df21ebae-039c-40c9-9b98-2ecb22a01bdb-kube-api-access-llpbs\") pod \"openstack-operator-controller-init-7845fcf9cf-qf9zn\" (UID: \"df21ebae-039c-40c9-9b98-2ecb22a01bdb\") " pod="openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.458693 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llpbs\" (UniqueName: \"kubernetes.io/projected/df21ebae-039c-40c9-9b98-2ecb22a01bdb-kube-api-access-llpbs\") pod \"openstack-operator-controller-init-7845fcf9cf-qf9zn\" (UID: \"df21ebae-039c-40c9-9b98-2ecb22a01bdb\") " pod="openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.505993 4777 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn" Feb 16 21:54:47 crc kubenswrapper[4777]: I0216 21:54:47.951792 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn"] Feb 16 21:54:48 crc kubenswrapper[4777]: I0216 21:54:48.522267 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn" event={"ID":"df21ebae-039c-40c9-9b98-2ecb22a01bdb","Type":"ContainerStarted","Data":"2a86e4d5e6792979c3b009b8a7cfe45bc455ea3e3eb5f7e4d065182f29470651"} Feb 16 21:54:48 crc kubenswrapper[4777]: I0216 21:54:48.945880 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fgr6v"] Feb 16 21:54:48 crc kubenswrapper[4777]: I0216 21:54:48.947225 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:48 crc kubenswrapper[4777]: I0216 21:54:48.955466 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgr6v"] Feb 16 21:54:49 crc kubenswrapper[4777]: I0216 21:54:49.059523 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-utilities\") pod \"redhat-marketplace-fgr6v\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:49 crc kubenswrapper[4777]: I0216 21:54:49.059793 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tfh4\" (UniqueName: \"kubernetes.io/projected/3d993291-c4a3-4dcc-9178-4a06b9ff996e-kube-api-access-8tfh4\") pod \"redhat-marketplace-fgr6v\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " 
pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:49 crc kubenswrapper[4777]: I0216 21:54:49.059879 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-catalog-content\") pod \"redhat-marketplace-fgr6v\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:49 crc kubenswrapper[4777]: I0216 21:54:49.161846 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tfh4\" (UniqueName: \"kubernetes.io/projected/3d993291-c4a3-4dcc-9178-4a06b9ff996e-kube-api-access-8tfh4\") pod \"redhat-marketplace-fgr6v\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:49 crc kubenswrapper[4777]: I0216 21:54:49.161899 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-catalog-content\") pod \"redhat-marketplace-fgr6v\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:49 crc kubenswrapper[4777]: I0216 21:54:49.161955 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-utilities\") pod \"redhat-marketplace-fgr6v\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:49 crc kubenswrapper[4777]: I0216 21:54:49.162778 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-utilities\") pod \"redhat-marketplace-fgr6v\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " 
pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:49 crc kubenswrapper[4777]: I0216 21:54:49.162784 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-catalog-content\") pod \"redhat-marketplace-fgr6v\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:49 crc kubenswrapper[4777]: I0216 21:54:49.186438 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tfh4\" (UniqueName: \"kubernetes.io/projected/3d993291-c4a3-4dcc-9178-4a06b9ff996e-kube-api-access-8tfh4\") pod \"redhat-marketplace-fgr6v\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:49 crc kubenswrapper[4777]: I0216 21:54:49.311451 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:51 crc kubenswrapper[4777]: I0216 21:54:51.947593 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgr6v"] Feb 16 21:54:51 crc kubenswrapper[4777]: W0216 21:54:51.954578 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d993291_c4a3_4dcc_9178_4a06b9ff996e.slice/crio-b4ef0552671a6eca6d9ac6ef94b7033da13701f72ccea210e3b53fa096a7db66 WatchSource:0}: Error finding container b4ef0552671a6eca6d9ac6ef94b7033da13701f72ccea210e3b53fa096a7db66: Status 404 returned error can't find the container with id b4ef0552671a6eca6d9ac6ef94b7033da13701f72ccea210e3b53fa096a7db66 Feb 16 21:54:52 crc kubenswrapper[4777]: I0216 21:54:52.562112 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn" 
event={"ID":"df21ebae-039c-40c9-9b98-2ecb22a01bdb","Type":"ContainerStarted","Data":"3dfdf28f38bc27a212a2629cdf1657d62f4a7ce62399894d664efbd2bf789279"} Feb 16 21:54:52 crc kubenswrapper[4777]: I0216 21:54:52.562582 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn" Feb 16 21:54:52 crc kubenswrapper[4777]: I0216 21:54:52.564299 4777 generic.go:334] "Generic (PLEG): container finished" podID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" containerID="d6728363648aca83b6b60f6eed2065cafd6bbdc8c6cb4f3bd7a3398e5de0e466" exitCode=0 Feb 16 21:54:52 crc kubenswrapper[4777]: I0216 21:54:52.564364 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr6v" event={"ID":"3d993291-c4a3-4dcc-9178-4a06b9ff996e","Type":"ContainerDied","Data":"d6728363648aca83b6b60f6eed2065cafd6bbdc8c6cb4f3bd7a3398e5de0e466"} Feb 16 21:54:52 crc kubenswrapper[4777]: I0216 21:54:52.564399 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr6v" event={"ID":"3d993291-c4a3-4dcc-9178-4a06b9ff996e","Type":"ContainerStarted","Data":"b4ef0552671a6eca6d9ac6ef94b7033da13701f72ccea210e3b53fa096a7db66"} Feb 16 21:54:52 crc kubenswrapper[4777]: I0216 21:54:52.628019 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn" podStartSLOduration=1.785911501 podStartE2EDuration="5.627983186s" podCreationTimestamp="2026-02-16 21:54:47 +0000 UTC" firstStartedPulling="2026-02-16 21:54:47.967103068 +0000 UTC m=+1008.549604170" lastFinishedPulling="2026-02-16 21:54:51.809174753 +0000 UTC m=+1012.391675855" observedRunningTime="2026-02-16 21:54:52.611174806 +0000 UTC m=+1013.193675938" watchObservedRunningTime="2026-02-16 21:54:52.627983186 +0000 UTC m=+1013.210484328" Feb 16 21:54:53 crc kubenswrapper[4777]: I0216 21:54:53.577016 4777 
generic.go:334] "Generic (PLEG): container finished" podID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" containerID="14d5043658aea6a0e76f84e55f4a50faa8865949fa66cb297f2c5dfc1dd3769e" exitCode=0 Feb 16 21:54:53 crc kubenswrapper[4777]: I0216 21:54:53.577143 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr6v" event={"ID":"3d993291-c4a3-4dcc-9178-4a06b9ff996e","Type":"ContainerDied","Data":"14d5043658aea6a0e76f84e55f4a50faa8865949fa66cb297f2c5dfc1dd3769e"} Feb 16 21:54:54 crc kubenswrapper[4777]: I0216 21:54:54.590512 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr6v" event={"ID":"3d993291-c4a3-4dcc-9178-4a06b9ff996e","Type":"ContainerStarted","Data":"499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc"} Feb 16 21:54:54 crc kubenswrapper[4777]: I0216 21:54:54.626330 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fgr6v" podStartSLOduration=5.208741497 podStartE2EDuration="6.626306465s" podCreationTimestamp="2026-02-16 21:54:48 +0000 UTC" firstStartedPulling="2026-02-16 21:54:52.566762253 +0000 UTC m=+1013.149263355" lastFinishedPulling="2026-02-16 21:54:53.984327191 +0000 UTC m=+1014.566828323" observedRunningTime="2026-02-16 21:54:54.622602192 +0000 UTC m=+1015.205103324" watchObservedRunningTime="2026-02-16 21:54:54.626306465 +0000 UTC m=+1015.208807577" Feb 16 21:54:57 crc kubenswrapper[4777]: I0216 21:54:57.509922 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7845fcf9cf-qf9zn" Feb 16 21:54:59 crc kubenswrapper[4777]: I0216 21:54:59.312047 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:59 crc kubenswrapper[4777]: I0216 21:54:59.312497 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:59 crc kubenswrapper[4777]: I0216 21:54:59.365672 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:54:59 crc kubenswrapper[4777]: I0216 21:54:59.703822 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:55:01 crc kubenswrapper[4777]: I0216 21:55:01.740234 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgr6v"] Feb 16 21:55:01 crc kubenswrapper[4777]: I0216 21:55:01.741057 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fgr6v" podUID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" containerName="registry-server" containerID="cri-o://499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc" gracePeriod=2 Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.176368 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.260508 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tfh4\" (UniqueName: \"kubernetes.io/projected/3d993291-c4a3-4dcc-9178-4a06b9ff996e-kube-api-access-8tfh4\") pod \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.260605 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-catalog-content\") pod \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.260819 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-utilities\") pod \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\" (UID: \"3d993291-c4a3-4dcc-9178-4a06b9ff996e\") " Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.262916 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-utilities" (OuterVolumeSpecName: "utilities") pod "3d993291-c4a3-4dcc-9178-4a06b9ff996e" (UID: "3d993291-c4a3-4dcc-9178-4a06b9ff996e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.270963 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d993291-c4a3-4dcc-9178-4a06b9ff996e-kube-api-access-8tfh4" (OuterVolumeSpecName: "kube-api-access-8tfh4") pod "3d993291-c4a3-4dcc-9178-4a06b9ff996e" (UID: "3d993291-c4a3-4dcc-9178-4a06b9ff996e"). InnerVolumeSpecName "kube-api-access-8tfh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.289798 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d993291-c4a3-4dcc-9178-4a06b9ff996e" (UID: "3d993291-c4a3-4dcc-9178-4a06b9ff996e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.363240 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.363298 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tfh4\" (UniqueName: \"kubernetes.io/projected/3d993291-c4a3-4dcc-9178-4a06b9ff996e-kube-api-access-8tfh4\") on node \"crc\" DevicePath \"\"" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.363323 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d993291-c4a3-4dcc-9178-4a06b9ff996e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.663439 4777 generic.go:334] "Generic (PLEG): container finished" podID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" containerID="499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc" exitCode=0 Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.663503 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr6v" event={"ID":"3d993291-c4a3-4dcc-9178-4a06b9ff996e","Type":"ContainerDied","Data":"499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc"} Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.663530 4777 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgr6v" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.663555 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgr6v" event={"ID":"3d993291-c4a3-4dcc-9178-4a06b9ff996e","Type":"ContainerDied","Data":"b4ef0552671a6eca6d9ac6ef94b7033da13701f72ccea210e3b53fa096a7db66"} Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.663584 4777 scope.go:117] "RemoveContainer" containerID="499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.688593 4777 scope.go:117] "RemoveContainer" containerID="14d5043658aea6a0e76f84e55f4a50faa8865949fa66cb297f2c5dfc1dd3769e" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.706188 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgr6v"] Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.729124 4777 scope.go:117] "RemoveContainer" containerID="d6728363648aca83b6b60f6eed2065cafd6bbdc8c6cb4f3bd7a3398e5de0e466" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.729868 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgr6v"] Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.743545 4777 scope.go:117] "RemoveContainer" containerID="499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc" Feb 16 21:55:02 crc kubenswrapper[4777]: E0216 21:55:02.744940 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc\": container with ID starting with 499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc not found: ID does not exist" containerID="499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.744986 4777 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc"} err="failed to get container status \"499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc\": rpc error: code = NotFound desc = could not find container \"499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc\": container with ID starting with 499271efa75a21a2315d330d1a5df447ebecb3c36a18e1d372a2294ce14dc3bc not found: ID does not exist" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.745018 4777 scope.go:117] "RemoveContainer" containerID="14d5043658aea6a0e76f84e55f4a50faa8865949fa66cb297f2c5dfc1dd3769e" Feb 16 21:55:02 crc kubenswrapper[4777]: E0216 21:55:02.745369 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14d5043658aea6a0e76f84e55f4a50faa8865949fa66cb297f2c5dfc1dd3769e\": container with ID starting with 14d5043658aea6a0e76f84e55f4a50faa8865949fa66cb297f2c5dfc1dd3769e not found: ID does not exist" containerID="14d5043658aea6a0e76f84e55f4a50faa8865949fa66cb297f2c5dfc1dd3769e" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.745390 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d5043658aea6a0e76f84e55f4a50faa8865949fa66cb297f2c5dfc1dd3769e"} err="failed to get container status \"14d5043658aea6a0e76f84e55f4a50faa8865949fa66cb297f2c5dfc1dd3769e\": rpc error: code = NotFound desc = could not find container \"14d5043658aea6a0e76f84e55f4a50faa8865949fa66cb297f2c5dfc1dd3769e\": container with ID starting with 14d5043658aea6a0e76f84e55f4a50faa8865949fa66cb297f2c5dfc1dd3769e not found: ID does not exist" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.745403 4777 scope.go:117] "RemoveContainer" containerID="d6728363648aca83b6b60f6eed2065cafd6bbdc8c6cb4f3bd7a3398e5de0e466" Feb 16 21:55:02 crc kubenswrapper[4777]: E0216 
21:55:02.745633 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6728363648aca83b6b60f6eed2065cafd6bbdc8c6cb4f3bd7a3398e5de0e466\": container with ID starting with d6728363648aca83b6b60f6eed2065cafd6bbdc8c6cb4f3bd7a3398e5de0e466 not found: ID does not exist" containerID="d6728363648aca83b6b60f6eed2065cafd6bbdc8c6cb4f3bd7a3398e5de0e466" Feb 16 21:55:02 crc kubenswrapper[4777]: I0216 21:55:02.745678 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6728363648aca83b6b60f6eed2065cafd6bbdc8c6cb4f3bd7a3398e5de0e466"} err="failed to get container status \"d6728363648aca83b6b60f6eed2065cafd6bbdc8c6cb4f3bd7a3398e5de0e466\": rpc error: code = NotFound desc = could not find container \"d6728363648aca83b6b60f6eed2065cafd6bbdc8c6cb4f3bd7a3398e5de0e466\": container with ID starting with d6728363648aca83b6b60f6eed2065cafd6bbdc8c6cb4f3bd7a3398e5de0e466 not found: ID does not exist" Feb 16 21:55:04 crc kubenswrapper[4777]: I0216 21:55:04.189426 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" path="/var/lib/kubelet/pods/3d993291-c4a3-4dcc-9178-4a06b9ff996e/volumes" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.862915 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl"] Feb 16 21:55:19 crc kubenswrapper[4777]: E0216 21:55:19.864923 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" containerName="extract-utilities" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.865022 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" containerName="extract-utilities" Feb 16 21:55:19 crc kubenswrapper[4777]: E0216 21:55:19.865187 4777 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" containerName="registry-server" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.865257 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" containerName="registry-server" Feb 16 21:55:19 crc kubenswrapper[4777]: E0216 21:55:19.865339 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" containerName="extract-content" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.865407 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" containerName="extract-content" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.865628 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d993291-c4a3-4dcc-9178-4a06b9ff996e" containerName="registry-server" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.866386 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.866574 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87"] Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.868153 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.871247 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-8dcpz" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.871431 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-m6sht" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.876216 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl"] Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.880642 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87"] Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.887836 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6"] Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.888752 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.890792 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-7nv4h" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.891765 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-6km9d"] Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.892603 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6km9d" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.895893 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-dw9q6" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.927773 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-6km9d"] Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.929948 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6"] Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.954932 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v"] Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.955769 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.962087 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-5hb8n" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.969272 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5pn9\" (UniqueName: \"kubernetes.io/projected/0994c976-c6c3-42e0-9b54-e09d2fac5447-kube-api-access-l5pn9\") pod \"cinder-operator-controller-manager-5d946d989d-rxd87\" (UID: \"0994c976-c6c3-42e0-9b54-e09d2fac5447\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.969338 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lfmh\" (UniqueName: \"kubernetes.io/projected/7f900c19-63a0-40b9-a8c4-c51883a5087e-kube-api-access-9lfmh\") pod \"glance-operator-controller-manager-77987464f4-6km9d\" (UID: \"7f900c19-63a0-40b9-a8c4-c51883a5087e\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-6km9d" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.969377 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfbcd\" (UniqueName: \"kubernetes.io/projected/f4b16894-e8b0-4476-94ae-c6112200fa5d-kube-api-access-qfbcd\") pod \"designate-operator-controller-manager-6d8bf5c495-ssfj6\" (UID: \"f4b16894-e8b0-4476-94ae-c6112200fa5d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.969399 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4285z\" (UniqueName: 
\"kubernetes.io/projected/367fde39-a8cd-443a-83f8-f9cb0e9f3576-kube-api-access-4285z\") pod \"barbican-operator-controller-manager-868647ff47-k22zl\" (UID: \"367fde39-a8cd-443a-83f8-f9cb0e9f3576\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" Feb 16 21:55:19 crc kubenswrapper[4777]: I0216 21:55:19.969464 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.008500 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-7shhv"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.010730 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.020322 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hz79m" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.020397 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.038996 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.044491 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.048219 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lmm4q" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.071780 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.072499 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lfmh\" (UniqueName: \"kubernetes.io/projected/7f900c19-63a0-40b9-a8c4-c51883a5087e-kube-api-access-9lfmh\") pod \"glance-operator-controller-manager-77987464f4-6km9d\" (UID: \"7f900c19-63a0-40b9-a8c4-c51883a5087e\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-6km9d" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.072561 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfbcd\" (UniqueName: \"kubernetes.io/projected/f4b16894-e8b0-4476-94ae-c6112200fa5d-kube-api-access-qfbcd\") pod \"designate-operator-controller-manager-6d8bf5c495-ssfj6\" (UID: \"f4b16894-e8b0-4476-94ae-c6112200fa5d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.072591 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4285z\" (UniqueName: \"kubernetes.io/projected/367fde39-a8cd-443a-83f8-f9cb0e9f3576-kube-api-access-4285z\") pod \"barbican-operator-controller-manager-868647ff47-k22zl\" (UID: \"367fde39-a8cd-443a-83f8-f9cb0e9f3576\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.072628 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfbtf\" (UniqueName: \"kubernetes.io/projected/3269da89-1f17-44af-8ea9-83eb493249a2-kube-api-access-lfbtf\") pod \"horizon-operator-controller-manager-5b9b8895d5-v2m7v\" (UID: \"3269da89-1f17-44af-8ea9-83eb493249a2\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.072653 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5pn9\" (UniqueName: \"kubernetes.io/projected/0994c976-c6c3-42e0-9b54-e09d2fac5447-kube-api-access-l5pn9\") pod \"cinder-operator-controller-manager-5d946d989d-rxd87\" (UID: \"0994c976-c6c3-42e0-9b54-e09d2fac5447\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.072738 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.075941 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-7shhv"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.081188 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-99tkj" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.109620 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.110907 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lfmh\" (UniqueName: \"kubernetes.io/projected/7f900c19-63a0-40b9-a8c4-c51883a5087e-kube-api-access-9lfmh\") pod \"glance-operator-controller-manager-77987464f4-6km9d\" (UID: 
\"7f900c19-63a0-40b9-a8c4-c51883a5087e\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-6km9d" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.119524 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfbcd\" (UniqueName: \"kubernetes.io/projected/f4b16894-e8b0-4476-94ae-c6112200fa5d-kube-api-access-qfbcd\") pod \"designate-operator-controller-manager-6d8bf5c495-ssfj6\" (UID: \"f4b16894-e8b0-4476-94ae-c6112200fa5d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.119526 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5pn9\" (UniqueName: \"kubernetes.io/projected/0994c976-c6c3-42e0-9b54-e09d2fac5447-kube-api-access-l5pn9\") pod \"cinder-operator-controller-manager-5d946d989d-rxd87\" (UID: \"0994c976-c6c3-42e0-9b54-e09d2fac5447\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.122294 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.133785 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.135159 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.145150 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-mqxwc" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.152168 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4285z\" (UniqueName: \"kubernetes.io/projected/367fde39-a8cd-443a-83f8-f9cb0e9f3576-kube-api-access-4285z\") pod \"barbican-operator-controller-manager-868647ff47-k22zl\" (UID: \"367fde39-a8cd-443a-83f8-f9cb0e9f3576\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.153928 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.177190 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.178199 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.178859 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lppsw\" (UniqueName: \"kubernetes.io/projected/522da514-4a99-4293-8fe8-45285b4f24eb-kube-api-access-lppsw\") pod \"heat-operator-controller-manager-69f49c598c-ss4k5\" (UID: \"522da514-4a99-4293-8fe8-45285b4f24eb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.178936 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert\") pod \"infra-operator-controller-manager-79d975b745-7shhv\" (UID: \"1d6bef33-b14f-435b-aa17-dfbed9c15d86\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.178975 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfbtf\" (UniqueName: \"kubernetes.io/projected/3269da89-1f17-44af-8ea9-83eb493249a2-kube-api-access-lfbtf\") pod \"horizon-operator-controller-manager-5b9b8895d5-v2m7v\" (UID: \"3269da89-1f17-44af-8ea9-83eb493249a2\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.179033 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmpjx\" (UniqueName: \"kubernetes.io/projected/c6f0dee9-40e3-42d5-8651-b340e075891c-kube-api-access-nmpjx\") pod \"ironic-operator-controller-manager-554564d7fc-9lpfx\" (UID: \"c6f0dee9-40e3-42d5-8651-b340e075891c\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" Feb 16 21:55:20 crc kubenswrapper[4777]: 
I0216 21:55:20.179085 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkcp2\" (UniqueName: \"kubernetes.io/projected/1d6bef33-b14f-435b-aa17-dfbed9c15d86-kube-api-access-wkcp2\") pod \"infra-operator-controller-manager-79d975b745-7shhv\" (UID: \"1d6bef33-b14f-435b-aa17-dfbed9c15d86\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.184605 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tzfwg" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.191257 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.214200 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.233088 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.243974 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfbtf\" (UniqueName: \"kubernetes.io/projected/3269da89-1f17-44af-8ea9-83eb493249a2-kube-api-access-lfbtf\") pod \"horizon-operator-controller-manager-5b9b8895d5-v2m7v\" (UID: \"3269da89-1f17-44af-8ea9-83eb493249a2\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.244059 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6km9d" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.249528 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.256334 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.257304 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.261414 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-b9n9f" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.289539 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.292167 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krgt6\" (UniqueName: \"kubernetes.io/projected/583dde29-d4a1-4ec2-9021-3735d6c5717b-kube-api-access-krgt6\") pod \"keystone-operator-controller-manager-b4d948c87-slq9f\" (UID: \"583dde29-d4a1-4ec2-9021-3735d6c5717b\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.292236 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lppsw\" (UniqueName: \"kubernetes.io/projected/522da514-4a99-4293-8fe8-45285b4f24eb-kube-api-access-lppsw\") pod \"heat-operator-controller-manager-69f49c598c-ss4k5\" (UID: \"522da514-4a99-4293-8fe8-45285b4f24eb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.292294 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert\") pod \"infra-operator-controller-manager-79d975b745-7shhv\" (UID: \"1d6bef33-b14f-435b-aa17-dfbed9c15d86\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.292330 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmpjx\" (UniqueName: \"kubernetes.io/projected/c6f0dee9-40e3-42d5-8651-b340e075891c-kube-api-access-nmpjx\") pod \"ironic-operator-controller-manager-554564d7fc-9lpfx\" (UID: \"c6f0dee9-40e3-42d5-8651-b340e075891c\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.292424 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkcp2\" (UniqueName: \"kubernetes.io/projected/1d6bef33-b14f-435b-aa17-dfbed9c15d86-kube-api-access-wkcp2\") pod \"infra-operator-controller-manager-79d975b745-7shhv\" (UID: \"1d6bef33-b14f-435b-aa17-dfbed9c15d86\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.292477 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69cwv\" (UniqueName: \"kubernetes.io/projected/49e70c01-9a6c-488a-a766-a430583352fb-kube-api-access-69cwv\") pod \"manila-operator-controller-manager-54f6768c69-v2dbc\" (UID: \"49e70c01-9a6c-488a-a766-a430583352fb\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" Feb 16 21:55:20 crc kubenswrapper[4777]: E0216 21:55:20.294580 4777 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 16 21:55:20 crc kubenswrapper[4777]: E0216 21:55:20.294643 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert podName:1d6bef33-b14f-435b-aa17-dfbed9c15d86 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:20.794623197 +0000 UTC m=+1041.377124299 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert") pod "infra-operator-controller-manager-79d975b745-7shhv" (UID: "1d6bef33-b14f-435b-aa17-dfbed9c15d86") : secret "infra-operator-webhook-server-cert" not found Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.309573 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.311004 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.319329 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xlfvg" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.338341 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lppsw\" (UniqueName: \"kubernetes.io/projected/522da514-4a99-4293-8fe8-45285b4f24eb-kube-api-access-lppsw\") pod \"heat-operator-controller-manager-69f49c598c-ss4k5\" (UID: \"522da514-4a99-4293-8fe8-45285b4f24eb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.357398 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkcp2\" (UniqueName: \"kubernetes.io/projected/1d6bef33-b14f-435b-aa17-dfbed9c15d86-kube-api-access-wkcp2\") pod \"infra-operator-controller-manager-79d975b745-7shhv\" (UID: \"1d6bef33-b14f-435b-aa17-dfbed9c15d86\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.357803 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmpjx\" (UniqueName: 
\"kubernetes.io/projected/c6f0dee9-40e3-42d5-8651-b340e075891c-kube-api-access-nmpjx\") pod \"ironic-operator-controller-manager-554564d7fc-9lpfx\" (UID: \"c6f0dee9-40e3-42d5-8651-b340e075891c\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.369200 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.394694 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.397263 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.399099 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwrj5\" (UniqueName: \"kubernetes.io/projected/35deb9f9-3cf5-4f92-9ba0-5f8eac14733a-kube-api-access-gwrj5\") pod \"neutron-operator-controller-manager-64ddbf8bb-nwzcm\" (UID: \"35deb9f9-3cf5-4f92-9ba0-5f8eac14733a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.399211 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krgt6\" (UniqueName: \"kubernetes.io/projected/583dde29-d4a1-4ec2-9021-3735d6c5717b-kube-api-access-krgt6\") pod \"keystone-operator-controller-manager-b4d948c87-slq9f\" (UID: \"583dde29-d4a1-4ec2-9021-3735d6c5717b\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.399348 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-726lm\" (UniqueName: \"kubernetes.io/projected/d255b1d6-45e9-4749-8252-b77b97cd940a-kube-api-access-726lm\") pod \"mariadb-operator-controller-manager-6994f66f48-znh8t\" (UID: \"d255b1d6-45e9-4749-8252-b77b97cd940a\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.399436 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69cwv\" (UniqueName: \"kubernetes.io/projected/49e70c01-9a6c-488a-a766-a430583352fb-kube-api-access-69cwv\") pod \"manila-operator-controller-manager-54f6768c69-v2dbc\" (UID: \"49e70c01-9a6c-488a-a766-a430583352fb\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.400095 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rz58b" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.425349 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.426676 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.440507 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-c78v8" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.471907 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69cwv\" (UniqueName: \"kubernetes.io/projected/49e70c01-9a6c-488a-a766-a430583352fb-kube-api-access-69cwv\") pod \"manila-operator-controller-manager-54f6768c69-v2dbc\" (UID: \"49e70c01-9a6c-488a-a766-a430583352fb\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.474819 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krgt6\" (UniqueName: \"kubernetes.io/projected/583dde29-d4a1-4ec2-9021-3735d6c5717b-kube-api-access-krgt6\") pod \"keystone-operator-controller-manager-b4d948c87-slq9f\" (UID: \"583dde29-d4a1-4ec2-9021-3735d6c5717b\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.485333 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.502411 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwrj5\" (UniqueName: \"kubernetes.io/projected/35deb9f9-3cf5-4f92-9ba0-5f8eac14733a-kube-api-access-gwrj5\") pod \"neutron-operator-controller-manager-64ddbf8bb-nwzcm\" (UID: \"35deb9f9-3cf5-4f92-9ba0-5f8eac14733a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.502531 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6bmj\" (UniqueName: \"kubernetes.io/projected/c69b97e2-c7ed-4b0a-9e29-fcd457d7a453-kube-api-access-b6bmj\") pod \"nova-operator-controller-manager-567668f5cf-mg48d\" (UID: \"c69b97e2-c7ed-4b0a-9e29-fcd457d7a453\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.502560 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-726lm\" (UniqueName: \"kubernetes.io/projected/d255b1d6-45e9-4749-8252-b77b97cd940a-kube-api-access-726lm\") pod \"mariadb-operator-controller-manager-6994f66f48-znh8t\" (UID: \"d255b1d6-45e9-4749-8252-b77b97cd940a\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.502589 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx86v\" (UniqueName: \"kubernetes.io/projected/9fbfe46d-cddb-41eb-920a-0859f6fcbb27-kube-api-access-wx86v\") pod \"octavia-operator-controller-manager-69f8888797-x2pd2\" (UID: \"9fbfe46d-cddb-41eb-920a-0859f6fcbb27\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.502827 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.522476 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.554392 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwrj5\" (UniqueName: \"kubernetes.io/projected/35deb9f9-3cf5-4f92-9ba0-5f8eac14733a-kube-api-access-gwrj5\") pod \"neutron-operator-controller-manager-64ddbf8bb-nwzcm\" (UID: \"35deb9f9-3cf5-4f92-9ba0-5f8eac14733a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.557243 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.562701 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-726lm\" (UniqueName: \"kubernetes.io/projected/d255b1d6-45e9-4749-8252-b77b97cd940a-kube-api-access-726lm\") pod \"mariadb-operator-controller-manager-6994f66f48-znh8t\" (UID: \"d255b1d6-45e9-4749-8252-b77b97cd940a\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.571137 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.580512 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.589404 4777 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.590357 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.596948 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.597805 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.607592 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.607934 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jkhwm" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.608045 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-dmrlb" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.610160 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.610951 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.611199 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.612508 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6bmj\" (UniqueName: \"kubernetes.io/projected/c69b97e2-c7ed-4b0a-9e29-fcd457d7a453-kube-api-access-b6bmj\") pod \"nova-operator-controller-manager-567668f5cf-mg48d\" (UID: \"c69b97e2-c7ed-4b0a-9e29-fcd457d7a453\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.612566 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx86v\" (UniqueName: \"kubernetes.io/projected/9fbfe46d-cddb-41eb-920a-0859f6fcbb27-kube-api-access-wx86v\") pod \"octavia-operator-controller-manager-69f8888797-x2pd2\" (UID: \"9fbfe46d-cddb-41eb-920a-0859f6fcbb27\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.625935 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.627818 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.630053 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx86v\" (UniqueName: \"kubernetes.io/projected/9fbfe46d-cddb-41eb-920a-0859f6fcbb27-kube-api-access-wx86v\") pod \"octavia-operator-controller-manager-69f8888797-x2pd2\" (UID: \"9fbfe46d-cddb-41eb-920a-0859f6fcbb27\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.631031 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hf9zx" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.641233 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.652607 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6bmj\" (UniqueName: \"kubernetes.io/projected/c69b97e2-c7ed-4b0a-9e29-fcd457d7a453-kube-api-access-b6bmj\") pod \"nova-operator-controller-manager-567668f5cf-mg48d\" (UID: \"c69b97e2-c7ed-4b0a-9e29-fcd457d7a453\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.655311 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.656292 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.700767 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-b8thc" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.704511 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.735898 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.767012 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg5gt\" (UniqueName: \"kubernetes.io/projected/c853fc19-52e2-407b-ac21-638cdf255085-kube-api-access-xg5gt\") pod \"placement-operator-controller-manager-8497b45c89-p9lqg\" (UID: \"c853fc19-52e2-407b-ac21-638cdf255085\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.767227 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x69j5\" (UniqueName: \"kubernetes.io/projected/b455e5b7-ca2c-4f12-91de-da9969199670-kube-api-access-x69j5\") pod \"ovn-operator-controller-manager-d44cf6b75-k4297\" (UID: \"b455e5b7-ca2c-4f12-91de-da9969199670\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.767384 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhxlp\" (UniqueName: \"kubernetes.io/projected/1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d-kube-api-access-mhxlp\") pod 
\"swift-operator-controller-manager-68f46476f-g8lbd\" (UID: \"1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.767466 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs\" (UID: \"3a482208-548a-47dc-aa40-c23cbcdf5363\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.767511 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xghs\" (UniqueName: \"kubernetes.io/projected/3a482208-548a-47dc-aa40-c23cbcdf5363-kube-api-access-7xghs\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs\" (UID: \"3a482208-548a-47dc-aa40-c23cbcdf5363\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.825824 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.836240 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.839437 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.842356 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-5b6rv" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.845664 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.846738 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.847649 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.860046 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-mzrs2"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.861409 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.864811 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nxjg8" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.868697 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhxlp\" (UniqueName: \"kubernetes.io/projected/1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d-kube-api-access-mhxlp\") pod \"swift-operator-controller-manager-68f46476f-g8lbd\" (UID: \"1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.868849 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs\" (UID: \"3a482208-548a-47dc-aa40-c23cbcdf5363\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.868924 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xghs\" (UniqueName: \"kubernetes.io/projected/3a482208-548a-47dc-aa40-c23cbcdf5363-kube-api-access-7xghs\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs\" (UID: \"3a482208-548a-47dc-aa40-c23cbcdf5363\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.869052 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg5gt\" (UniqueName: \"kubernetes.io/projected/c853fc19-52e2-407b-ac21-638cdf255085-kube-api-access-xg5gt\") pod 
\"placement-operator-controller-manager-8497b45c89-p9lqg\" (UID: \"c853fc19-52e2-407b-ac21-638cdf255085\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg" Feb 16 21:55:20 crc kubenswrapper[4777]: E0216 21:55:20.869532 4777 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 16 21:55:20 crc kubenswrapper[4777]: E0216 21:55:20.869609 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert podName:3a482208-548a-47dc-aa40-c23cbcdf5363 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:21.369586716 +0000 UTC m=+1041.952087808 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" (UID: "3a482208-548a-47dc-aa40-c23cbcdf5363") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 16 21:55:20 crc kubenswrapper[4777]: E0216 21:55:20.870000 4777 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 16 21:55:20 crc kubenswrapper[4777]: E0216 21:55:20.870032 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert podName:1d6bef33-b14f-435b-aa17-dfbed9c15d86 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:21.870024269 +0000 UTC m=+1042.452525371 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert") pod "infra-operator-controller-manager-79d975b745-7shhv" (UID: "1d6bef33-b14f-435b-aa17-dfbed9c15d86") : secret "infra-operator-webhook-server-cert" not found Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.870412 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert\") pod \"infra-operator-controller-manager-79d975b745-7shhv\" (UID: \"1d6bef33-b14f-435b-aa17-dfbed9c15d86\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.870548 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x69j5\" (UniqueName: \"kubernetes.io/projected/b455e5b7-ca2c-4f12-91de-da9969199670-kube-api-access-x69j5\") pod \"ovn-operator-controller-manager-d44cf6b75-k4297\" (UID: \"b455e5b7-ca2c-4f12-91de-da9969199670\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.871200 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-mzrs2"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.885814 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.886820 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.888301 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xghs\" (UniqueName: \"kubernetes.io/projected/3a482208-548a-47dc-aa40-c23cbcdf5363-kube-api-access-7xghs\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs\" (UID: \"3a482208-548a-47dc-aa40-c23cbcdf5363\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.888619 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhxlp\" (UniqueName: \"kubernetes.io/projected/1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d-kube-api-access-mhxlp\") pod \"swift-operator-controller-manager-68f46476f-g8lbd\" (UID: \"1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.892312 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-f5c7n" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.893553 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x69j5\" (UniqueName: \"kubernetes.io/projected/b455e5b7-ca2c-4f12-91de-da9969199670-kube-api-access-x69j5\") pod \"ovn-operator-controller-manager-d44cf6b75-k4297\" (UID: \"b455e5b7-ca2c-4f12-91de-da9969199670\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.893616 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.911319 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-xg5gt\" (UniqueName: \"kubernetes.io/projected/c853fc19-52e2-407b-ac21-638cdf255085-kube-api-access-xg5gt\") pod \"placement-operator-controller-manager-8497b45c89-p9lqg\" (UID: \"c853fc19-52e2-407b-ac21-638cdf255085\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.934357 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.935684 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.941452 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.944447 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-shdxw" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.944630 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.952357 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.952652 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.964864 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6"] Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.967043 4777 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.973298 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz4kw\" (UniqueName: \"kubernetes.io/projected/20881f3c-e228-4c63-9eb9-dbedd55ffc14-kube-api-access-qz4kw\") pod \"test-operator-controller-manager-7866795846-mzrs2\" (UID: \"20881f3c-e228-4c63-9eb9-dbedd55ffc14\") " pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.973354 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxf5s\" (UniqueName: \"kubernetes.io/projected/1be584eb-83c6-4709-8f10-60474c6f3f93-kube-api-access-wxf5s\") pod \"telemetry-operator-controller-manager-79996fd568-n7rmk\" (UID: \"1be584eb-83c6-4709-8f10-60474c6f3f93\") " pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.974104 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-f48fw" Feb 16 21:55:20 crc kubenswrapper[4777]: I0216 21:55:20.976395 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6"] Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.017373 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.017762 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.074654 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p24d4\" (UniqueName: \"kubernetes.io/projected/c1a0562b-8960-485e-9c6a-da5a73a18180-kube-api-access-p24d4\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.074755 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz4kw\" (UniqueName: \"kubernetes.io/projected/20881f3c-e228-4c63-9eb9-dbedd55ffc14-kube-api-access-qz4kw\") pod \"test-operator-controller-manager-7866795846-mzrs2\" (UID: \"20881f3c-e228-4c63-9eb9-dbedd55ffc14\") " pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.074787 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxf5s\" (UniqueName: \"kubernetes.io/projected/1be584eb-83c6-4709-8f10-60474c6f3f93-kube-api-access-wxf5s\") pod \"telemetry-operator-controller-manager-79996fd568-n7rmk\" (UID: \"1be584eb-83c6-4709-8f10-60474c6f3f93\") " pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.074806 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cpc9\" (UniqueName: \"kubernetes.io/projected/7afb9e38-ac2c-483f-a98e-716bfa22ee6d-kube-api-access-6cpc9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-w4ld6\" (UID: \"7afb9e38-ac2c-483f-a98e-716bfa22ee6d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6" Feb 16 21:55:21 crc 
kubenswrapper[4777]: I0216 21:55:21.074829 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.074854 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgm8s\" (UniqueName: \"kubernetes.io/projected/a565ae46-9b4e-4310-8867-230d4a65eea5-kube-api-access-hgm8s\") pod \"watcher-operator-controller-manager-5db88f68c-b7tpr\" (UID: \"a565ae46-9b4e-4310-8867-230d4a65eea5\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.074891 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.075892 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl"] Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.101043 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxf5s\" (UniqueName: \"kubernetes.io/projected/1be584eb-83c6-4709-8f10-60474c6f3f93-kube-api-access-wxf5s\") pod \"telemetry-operator-controller-manager-79996fd568-n7rmk\" (UID: \"1be584eb-83c6-4709-8f10-60474c6f3f93\") " 
pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.101511 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz4kw\" (UniqueName: \"kubernetes.io/projected/20881f3c-e228-4c63-9eb9-dbedd55ffc14-kube-api-access-qz4kw\") pod \"test-operator-controller-manager-7866795846-mzrs2\" (UID: \"20881f3c-e228-4c63-9eb9-dbedd55ffc14\") " pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.175727 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cpc9\" (UniqueName: \"kubernetes.io/projected/7afb9e38-ac2c-483f-a98e-716bfa22ee6d-kube-api-access-6cpc9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-w4ld6\" (UID: \"7afb9e38-ac2c-483f-a98e-716bfa22ee6d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.175776 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.175805 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgm8s\" (UniqueName: \"kubernetes.io/projected/a565ae46-9b4e-4310-8867-230d4a65eea5-kube-api-access-hgm8s\") pod \"watcher-operator-controller-manager-5db88f68c-b7tpr\" (UID: \"a565ae46-9b4e-4310-8867-230d4a65eea5\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.175856 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.175889 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p24d4\" (UniqueName: \"kubernetes.io/projected/c1a0562b-8960-485e-9c6a-da5a73a18180-kube-api-access-p24d4\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.176232 4777 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.176294 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:21.676275169 +0000 UTC m=+1042.258776271 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "webhook-server-cert" not found Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.176346 4777 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.176377 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:21.676368581 +0000 UTC m=+1042.258869683 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "metrics-server-cert" not found Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.200312 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cpc9\" (UniqueName: \"kubernetes.io/projected/7afb9e38-ac2c-483f-a98e-716bfa22ee6d-kube-api-access-6cpc9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-w4ld6\" (UID: \"7afb9e38-ac2c-483f-a98e-716bfa22ee6d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.201669 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgm8s\" (UniqueName: \"kubernetes.io/projected/a565ae46-9b4e-4310-8867-230d4a65eea5-kube-api-access-hgm8s\") pod \"watcher-operator-controller-manager-5db88f68c-b7tpr\" (UID: \"a565ae46-9b4e-4310-8867-230d4a65eea5\") " 
pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.210762 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p24d4\" (UniqueName: \"kubernetes.io/projected/c1a0562b-8960-485e-9c6a-da5a73a18180-kube-api-access-p24d4\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.227629 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87"] Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.238653 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.256229 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.266814 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.320692 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.379013 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs\" (UID: \"3a482208-548a-47dc-aa40-c23cbcdf5363\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.380978 4777 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.381086 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert podName:3a482208-548a-47dc-aa40-c23cbcdf5363 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:22.381057339 +0000 UTC m=+1042.963558441 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" (UID: "3a482208-548a-47dc-aa40-c23cbcdf5363") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.648493 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx"] Feb 16 21:55:21 crc kubenswrapper[4777]: W0216 21:55:21.653662 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod522da514_4a99_4293_8fe8_45285b4f24eb.slice/crio-5b24c33518f84cd52ae1f3effb398b2bef5db30e946a51dba2359b85e1bdc7eb WatchSource:0}: Error finding container 5b24c33518f84cd52ae1f3effb398b2bef5db30e946a51dba2359b85e1bdc7eb: Status 404 returned error can't find the container with id 5b24c33518f84cd52ae1f3effb398b2bef5db30e946a51dba2359b85e1bdc7eb Feb 16 21:55:21 crc kubenswrapper[4777]: W0216 21:55:21.655310 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6f0dee9_40e3_42d5_8651_b340e075891c.slice/crio-59df287b3178b7b5444e884e4242220e07403a2fc65b8f4430d9d707a1ee5b9b WatchSource:0}: Error finding container 59df287b3178b7b5444e884e4242220e07403a2fc65b8f4430d9d707a1ee5b9b: Status 404 returned error can't find the container with id 59df287b3178b7b5444e884e4242220e07403a2fc65b8f4430d9d707a1ee5b9b Feb 16 21:55:21 crc kubenswrapper[4777]: W0216 21:55:21.663490 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f900c19_63a0_40b9_a8c4_c51883a5087e.slice/crio-b56f833a987fd237ac3afaaced2f469f372f81fc150c32e8f1031a5c9e9d1cb6 WatchSource:0}: Error finding container 
b56f833a987fd237ac3afaaced2f469f372f81fc150c32e8f1031a5c9e9d1cb6: Status 404 returned error can't find the container with id b56f833a987fd237ac3afaaced2f469f372f81fc150c32e8f1031a5c9e9d1cb6 Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.665762 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5"] Feb 16 21:55:21 crc kubenswrapper[4777]: W0216 21:55:21.677176 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3269da89_1f17_44af_8ea9_83eb493249a2.slice/crio-d52675559b9009bef1c5a2066bf655039f06f25fa713ec5013e1843d0e8eec9b WatchSource:0}: Error finding container d52675559b9009bef1c5a2066bf655039f06f25fa713ec5013e1843d0e8eec9b: Status 404 returned error can't find the container with id d52675559b9009bef1c5a2066bf655039f06f25fa713ec5013e1843d0e8eec9b Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.679539 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-6km9d"] Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.683653 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.683762 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " 
pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.683955 4777 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.684009 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:22.683986086 +0000 UTC m=+1043.266487178 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "metrics-server-cert" not found Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.684051 4777 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.684105 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:22.684063568 +0000 UTC m=+1043.266564670 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "webhook-server-cert" not found Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.685374 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v"] Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.871756 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5" event={"ID":"522da514-4a99-4293-8fe8-45285b4f24eb","Type":"ContainerStarted","Data":"5b24c33518f84cd52ae1f3effb398b2bef5db30e946a51dba2359b85e1bdc7eb"} Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.875127 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" event={"ID":"3269da89-1f17-44af-8ea9-83eb493249a2","Type":"ContainerStarted","Data":"d52675559b9009bef1c5a2066bf655039f06f25fa713ec5013e1843d0e8eec9b"} Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.877984 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" event={"ID":"c6f0dee9-40e3-42d5-8651-b340e075891c","Type":"ContainerStarted","Data":"59df287b3178b7b5444e884e4242220e07403a2fc65b8f4430d9d707a1ee5b9b"} Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.880607 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" event={"ID":"367fde39-a8cd-443a-83f8-f9cb0e9f3576","Type":"ContainerStarted","Data":"dc4e16cb0d4ef4768738c65c4b5559caef731fc4c7b5193b5ffaf4b53803f2cd"} Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.882197 4777 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87" event={"ID":"0994c976-c6c3-42e0-9b54-e09d2fac5447","Type":"ContainerStarted","Data":"a09bef8efd8f88bb6bdd874e3e1a6ec51583e1c7e01f77fed68c0b2ff74caf18"} Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.884774 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6km9d" event={"ID":"7f900c19-63a0-40b9-a8c4-c51883a5087e","Type":"ContainerStarted","Data":"b56f833a987fd237ac3afaaced2f469f372f81fc150c32e8f1031a5c9e9d1cb6"} Feb 16 21:55:21 crc kubenswrapper[4777]: I0216 21:55:21.885985 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert\") pod \"infra-operator-controller-manager-79d975b745-7shhv\" (UID: \"1d6bef33-b14f-435b-aa17-dfbed9c15d86\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.886163 4777 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 16 21:55:21 crc kubenswrapper[4777]: E0216 21:55:21.886221 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert podName:1d6bef33-b14f-435b-aa17-dfbed9c15d86 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:23.886203315 +0000 UTC m=+1044.468704417 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert") pod "infra-operator-controller-manager-79d975b745-7shhv" (UID: "1d6bef33-b14f-435b-aa17-dfbed9c15d86") : secret "infra-operator-webhook-server-cert" not found Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.069702 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc"] Feb 16 21:55:22 crc kubenswrapper[4777]: W0216 21:55:22.075771 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc853fc19_52e2_407b_ac21_638cdf255085.slice/crio-c39bf8ffb3e881dabda1e619b52e9d4e7cae0e119f3ef7267a851f7f945efbcf WatchSource:0}: Error finding container c39bf8ffb3e881dabda1e619b52e9d4e7cae0e119f3ef7267a851f7f945efbcf: Status 404 returned error can't find the container with id c39bf8ffb3e881dabda1e619b52e9d4e7cae0e119f3ef7267a851f7f945efbcf Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.079350 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297"] Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.089466 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg"] Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.095484 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f"] Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.098112 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2"] Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.102407 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6"] Feb 16 21:55:22 crc kubenswrapper[4777]: W0216 21:55:22.103872 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb455e5b7_ca2c_4f12_91de_da9969199670.slice/crio-eabc154ba840676e2ddf1b854bfae545aafe962340eda390f8cb47ff7d8f32e7 WatchSource:0}: Error finding container eabc154ba840676e2ddf1b854bfae545aafe962340eda390f8cb47ff7d8f32e7: Status 404 returned error can't find the container with id eabc154ba840676e2ddf1b854bfae545aafe962340eda390f8cb47ff7d8f32e7 Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.109734 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm"] Feb 16 21:55:22 crc kubenswrapper[4777]: W0216 21:55:22.112576 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fbfe46d_cddb_41eb_920a_0859f6fcbb27.slice/crio-1143c68a30e940b234bdb92c0c9bf67def0aa98a64d913cfab3c2327c0206b1b WatchSource:0}: Error finding container 1143c68a30e940b234bdb92c0c9bf67def0aa98a64d913cfab3c2327c0206b1b: Status 404 returned error can't find the container with id 1143c68a30e940b234bdb92c0c9bf67def0aa98a64d913cfab3c2327c0206b1b Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.118653 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d"] Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.129175 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk"] Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.132919 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6"] Feb 16 21:55:22 crc kubenswrapper[4777]: W0216 
21:55:22.148449 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc69b97e2_c7ed_4b0a_9e29_fcd457d7a453.slice/crio-a1b1d40be18ef2b15f8051dd949e7341d280e43dea908ccb66e92a1143a1052f WatchSource:0}: Error finding container a1b1d40be18ef2b15f8051dd949e7341d280e43dea908ccb66e92a1143a1052f: Status 404 returned error can't find the container with id a1b1d40be18ef2b15f8051dd949e7341d280e43dea908ccb66e92a1143a1052f Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.149169 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr"] Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.154593 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd"] Feb 16 21:55:22 crc kubenswrapper[4777]: W0216 21:55:22.158101 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7afb9e38_ac2c_483f_a98e_716bfa22ee6d.slice/crio-ddc5cde9d7a22008040ab530f63d617d48e02e25d70f193389720fb418373a4e WatchSource:0}: Error finding container ddc5cde9d7a22008040ab530f63d617d48e02e25d70f193389720fb418373a4e: Status 404 returned error can't find the container with id ddc5cde9d7a22008040ab530f63d617d48e02e25d70f193389720fb418373a4e Feb 16 21:55:22 crc kubenswrapper[4777]: W0216 21:55:22.176437 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd255b1d6_45e9_4749_8252_b77b97cd940a.slice/crio-0e757d02f24090c5cdb8ac609443bbc902ab85bd02783d69802567b7617ac669 WatchSource:0}: Error finding container 0e757d02f24090c5cdb8ac609443bbc902ab85bd02783d69802567b7617ac669: Status 404 returned error can't find the container with id 0e757d02f24090c5cdb8ac609443bbc902ab85bd02783d69802567b7617ac669 Feb 16 21:55:22 crc 
kubenswrapper[4777]: I0216 21:55:22.181167 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t"] Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.198598 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-mzrs2"] Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.201344 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mhxlp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-g8lbd_openstack-operators(1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.201460 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-726lm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-znh8t_openstack-operators(d255b1d6-45e9-4749-8252-b77b97cd940a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.201552 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gwrj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-nwzcm_openstack-operators(35deb9f9-3cf5-4f92-9ba0-5f8eac14733a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 16 21:55:22 crc kubenswrapper[4777]: W0216 21:55:22.201912 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1be584eb_83c6_4709_8f10_60474c6f3f93.slice/crio-b2a04efd2f2ff27d5c1d991612c0ae7aaa4f07effe4602d1afd949b760154633 WatchSource:0}: Error finding container b2a04efd2f2ff27d5c1d991612c0ae7aaa4f07effe4602d1afd949b760154633: Status 404 returned error can't find the container with id b2a04efd2f2ff27d5c1d991612c0ae7aaa4f07effe4602d1afd949b760154633 Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.202910 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" podUID="35deb9f9-3cf5-4f92-9ba0-5f8eac14733a" Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.202996 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" podUID="1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d" Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.203013 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" podUID="d255b1d6-45e9-4749-8252-b77b97cd940a" Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.206246 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.102:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wxf5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-79996fd568-n7rmk_openstack-operators(1be584eb-83c6-4709-8f10-60474c6f3f93): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.208415 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" podUID="1be584eb-83c6-4709-8f10-60474c6f3f93" Feb 16 21:55:22 crc kubenswrapper[4777]: W0216 21:55:22.228544 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda565ae46_9b4e_4310_8867_230d4a65eea5.slice/crio-e164afde6952ca3315a207a5f4235f07eabf31c1ebd802cce49626e6721b141f WatchSource:0}: Error finding container 
e164afde6952ca3315a207a5f4235f07eabf31c1ebd802cce49626e6721b141f: Status 404 returned error can't find the container with id e164afde6952ca3315a207a5f4235f07eabf31c1ebd802cce49626e6721b141f Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.232682 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qz4kw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-mzrs2_openstack-operators(20881f3c-e228-4c63-9eb9-dbedd55ffc14): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.232873 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hgm8s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-b7tpr_openstack-operators(a565ae46-9b4e-4310-8867-230d4a65eea5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.234570 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" podUID="a565ae46-9b4e-4310-8867-230d4a65eea5" Feb 16 21:55:22 crc 
kubenswrapper[4777]: E0216 21:55:22.234615 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" podUID="20881f3c-e228-4c63-9eb9-dbedd55ffc14" Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.395431 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs\" (UID: \"3a482208-548a-47dc-aa40-c23cbcdf5363\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.395644 4777 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.395726 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert podName:3a482208-548a-47dc-aa40-c23cbcdf5363 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:24.395689873 +0000 UTC m=+1044.978190975 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" (UID: "3a482208-548a-47dc-aa40-c23cbcdf5363") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.699896 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.700042 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.700186 4777 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.700231 4777 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.700248 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:24.700229155 +0000 UTC m=+1045.282730257 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "webhook-server-cert" not found Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.700339 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:24.700315427 +0000 UTC m=+1045.282816529 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "metrics-server-cert" not found Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.897607 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" event={"ID":"49e70c01-9a6c-488a-a766-a430583352fb","Type":"ContainerStarted","Data":"a4cc3c8a1d66050ed11c8975a0549173e3fe1af08f95fbada4b0720702c77d5b"} Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.900385 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" event={"ID":"c69b97e2-c7ed-4b0a-9e29-fcd457d7a453","Type":"ContainerStarted","Data":"a1b1d40be18ef2b15f8051dd949e7341d280e43dea908ccb66e92a1143a1052f"} Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.902370 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6" 
event={"ID":"7afb9e38-ac2c-483f-a98e-716bfa22ee6d","Type":"ContainerStarted","Data":"ddc5cde9d7a22008040ab530f63d617d48e02e25d70f193389720fb418373a4e"} Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.905213 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" event={"ID":"1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d","Type":"ContainerStarted","Data":"5af5ea9c20909dc7f83345cf8abb7a96203e5e83131b24df9c4b311c74ab6086"} Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.907708 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2" event={"ID":"9fbfe46d-cddb-41eb-920a-0859f6fcbb27","Type":"ContainerStarted","Data":"1143c68a30e940b234bdb92c0c9bf67def0aa98a64d913cfab3c2327c0206b1b"} Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.911079 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" podUID="1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d" Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.911780 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297" event={"ID":"b455e5b7-ca2c-4f12-91de-da9969199670","Type":"ContainerStarted","Data":"eabc154ba840676e2ddf1b854bfae545aafe962340eda390f8cb47ff7d8f32e7"} Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.913827 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6" 
event={"ID":"f4b16894-e8b0-4476-94ae-c6112200fa5d","Type":"ContainerStarted","Data":"77b486cc04b372a82e1d94f5c3aed81c3e95f3c2201175330ee2acb6d02f1700"} Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.915664 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" event={"ID":"1be584eb-83c6-4709-8f10-60474c6f3f93","Type":"ContainerStarted","Data":"b2a04efd2f2ff27d5c1d991612c0ae7aaa4f07effe4602d1afd949b760154633"} Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.955116 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" event={"ID":"d255b1d6-45e9-4749-8252-b77b97cd940a","Type":"ContainerStarted","Data":"0e757d02f24090c5cdb8ac609443bbc902ab85bd02783d69802567b7617ac669"} Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.957295 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.102:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" podUID="1be584eb-83c6-4709-8f10-60474c6f3f93" Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.957494 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg" event={"ID":"c853fc19-52e2-407b-ac21-638cdf255085","Type":"ContainerStarted","Data":"c39bf8ffb3e881dabda1e619b52e9d4e7cae0e119f3ef7267a851f7f945efbcf"} Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.959283 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" podUID="d255b1d6-45e9-4749-8252-b77b97cd940a" Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.962547 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" event={"ID":"a565ae46-9b4e-4310-8867-230d4a65eea5","Type":"ContainerStarted","Data":"e164afde6952ca3315a207a5f4235f07eabf31c1ebd802cce49626e6721b141f"} Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.964542 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" podUID="a565ae46-9b4e-4310-8867-230d4a65eea5" Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.964672 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" event={"ID":"583dde29-d4a1-4ec2-9021-3735d6c5717b","Type":"ContainerStarted","Data":"04c76c476f092ab87c5f1ea60670920de29a812969caa9b958dee270e4875602"} Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.966942 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" event={"ID":"20881f3c-e228-4c63-9eb9-dbedd55ffc14","Type":"ContainerStarted","Data":"77c0353fb55c97394871eab8ef878ed1d6c4e1e07fb7d099abab85aa6cf02c7c"} Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.970890 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" podUID="20881f3c-e228-4c63-9eb9-dbedd55ffc14" Feb 16 21:55:22 crc kubenswrapper[4777]: I0216 21:55:22.975431 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" event={"ID":"35deb9f9-3cf5-4f92-9ba0-5f8eac14733a","Type":"ContainerStarted","Data":"a302c92042087ce5fa9488796d6dcf6f666965e0f9283d69c1b352a4c0ed2c73"} Feb 16 21:55:22 crc kubenswrapper[4777]: E0216 21:55:22.982965 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" podUID="35deb9f9-3cf5-4f92-9ba0-5f8eac14733a" Feb 16 21:55:23 crc kubenswrapper[4777]: I0216 21:55:23.924402 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert\") pod \"infra-operator-controller-manager-79d975b745-7shhv\" (UID: \"1d6bef33-b14f-435b-aa17-dfbed9c15d86\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" Feb 16 21:55:23 crc kubenswrapper[4777]: E0216 21:55:23.924581 4777 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 16 21:55:23 crc kubenswrapper[4777]: E0216 21:55:23.924652 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert podName:1d6bef33-b14f-435b-aa17-dfbed9c15d86 nodeName:}" failed. 
No retries permitted until 2026-02-16 21:55:27.924633017 +0000 UTC m=+1048.507134119 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert") pod "infra-operator-controller-manager-79d975b745-7shhv" (UID: "1d6bef33-b14f-435b-aa17-dfbed9c15d86") : secret "infra-operator-webhook-server-cert" not found
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.003926 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" podUID="35deb9f9-3cf5-4f92-9ba0-5f8eac14733a"
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.003962 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" podUID="d255b1d6-45e9-4749-8252-b77b97cd940a"
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.005070 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" podUID="a565ae46-9b4e-4310-8867-230d4a65eea5"
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.005123 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" podUID="20881f3c-e228-4c63-9eb9-dbedd55ffc14"
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.005478 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" podUID="1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d"
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.017195 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.102:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" podUID="1be584eb-83c6-4709-8f10-60474c6f3f93"
Feb 16 21:55:24 crc kubenswrapper[4777]: I0216 21:55:24.437229 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs\" (UID: \"3a482208-548a-47dc-aa40-c23cbcdf5363\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs"
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.437523 4777 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.437633 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert podName:3a482208-548a-47dc-aa40-c23cbcdf5363 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:28.437606982 +0000 UTC m=+1049.020108084 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" (UID: "3a482208-548a-47dc-aa40-c23cbcdf5363") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 16 21:55:24 crc kubenswrapper[4777]: I0216 21:55:24.742293 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7"
Feb 16 21:55:24 crc kubenswrapper[4777]: I0216 21:55:24.742415 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7"
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.742483 4777 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.742564 4777 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.742573 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:28.742546766 +0000 UTC m=+1049.325047868 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "metrics-server-cert" not found
Feb 16 21:55:24 crc kubenswrapper[4777]: E0216 21:55:24.742651 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:28.742631808 +0000 UTC m=+1049.325132910 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "webhook-server-cert" not found
Feb 16 21:55:28 crc kubenswrapper[4777]: I0216 21:55:28.016038 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert\") pod \"infra-operator-controller-manager-79d975b745-7shhv\" (UID: \"1d6bef33-b14f-435b-aa17-dfbed9c15d86\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv"
Feb 16 21:55:28 crc kubenswrapper[4777]: E0216 21:55:28.016292 4777 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 16 21:55:28 crc kubenswrapper[4777]: E0216 21:55:28.016598 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert podName:1d6bef33-b14f-435b-aa17-dfbed9c15d86 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:36.016559654 +0000 UTC m=+1056.599060816 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert") pod "infra-operator-controller-manager-79d975b745-7shhv" (UID: "1d6bef33-b14f-435b-aa17-dfbed9c15d86") : secret "infra-operator-webhook-server-cert" not found
Feb 16 21:55:28 crc kubenswrapper[4777]: I0216 21:55:28.526867 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs\" (UID: \"3a482208-548a-47dc-aa40-c23cbcdf5363\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs"
Feb 16 21:55:28 crc kubenswrapper[4777]: E0216 21:55:28.527131 4777 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 16 21:55:28 crc kubenswrapper[4777]: E0216 21:55:28.527236 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert podName:3a482208-548a-47dc-aa40-c23cbcdf5363 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:36.527211164 +0000 UTC m=+1057.109712266 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" (UID: "3a482208-548a-47dc-aa40-c23cbcdf5363") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 16 21:55:28 crc kubenswrapper[4777]: I0216 21:55:28.830076 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7"
Feb 16 21:55:28 crc kubenswrapper[4777]: I0216 21:55:28.830150 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7"
Feb 16 21:55:28 crc kubenswrapper[4777]: E0216 21:55:28.830295 4777 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 16 21:55:28 crc kubenswrapper[4777]: E0216 21:55:28.830362 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:36.830343417 +0000 UTC m=+1057.412844519 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "metrics-server-cert" not found
Feb 16 21:55:28 crc kubenswrapper[4777]: E0216 21:55:28.830383 4777 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 16 21:55:28 crc kubenswrapper[4777]: E0216 21:55:28.830514 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:36.830479051 +0000 UTC m=+1057.412980243 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "webhook-server-cert" not found
Feb 16 21:55:34 crc kubenswrapper[4777]: E0216 21:55:34.218775 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da"
Feb 16 21:55:34 crc kubenswrapper[4777]: E0216 21:55:34.219315 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lfbtf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-v2m7v_openstack-operators(3269da89-1f17-44af-8ea9-83eb493249a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 16 21:55:34 crc kubenswrapper[4777]: E0216 21:55:34.221229 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" podUID="3269da89-1f17-44af-8ea9-83eb493249a2"
Feb 16 21:55:35 crc kubenswrapper[4777]: E0216 21:55:35.087592 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867"
Feb 16 21:55:35 crc kubenswrapper[4777]: E0216 21:55:35.087868 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nmpjx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-9lpfx_openstack-operators(c6f0dee9-40e3-42d5-8651-b340e075891c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 16 21:55:35 crc kubenswrapper[4777]: E0216 21:55:35.090802 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" podUID="c6f0dee9-40e3-42d5-8651-b340e075891c"
Feb 16 21:55:35 crc kubenswrapper[4777]: E0216 21:55:35.123557 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" podUID="c6f0dee9-40e3-42d5-8651-b340e075891c"
Feb 16 21:55:35 crc kubenswrapper[4777]: E0216 21:55:35.124614 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" podUID="3269da89-1f17-44af-8ea9-83eb493249a2"
Feb 16 21:55:36 crc kubenswrapper[4777]: I0216 21:55:36.047788 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert\") pod \"infra-operator-controller-manager-79d975b745-7shhv\" (UID: \"1d6bef33-b14f-435b-aa17-dfbed9c15d86\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv"
Feb 16 21:55:36 crc kubenswrapper[4777]: I0216 21:55:36.054266 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d6bef33-b14f-435b-aa17-dfbed9c15d86-cert\") pod \"infra-operator-controller-manager-79d975b745-7shhv\" (UID: \"1d6bef33-b14f-435b-aa17-dfbed9c15d86\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv"
Feb 16 21:55:36 crc kubenswrapper[4777]: E0216 21:55:36.104270 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c"
Feb 16 21:55:36 crc kubenswrapper[4777]: E0216 21:55:36.104498 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-69cwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-v2dbc_openstack-operators(49e70c01-9a6c-488a-a766-a430583352fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 16 21:55:36 crc kubenswrapper[4777]: E0216 21:55:36.105761 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" podUID="49e70c01-9a6c-488a-a766-a430583352fb"
Feb 16 21:55:36 crc kubenswrapper[4777]: E0216 21:55:36.137368 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" podUID="49e70c01-9a6c-488a-a766-a430583352fb"
Feb 16 21:55:36 crc kubenswrapper[4777]: I0216 21:55:36.254485 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv"
Feb 16 21:55:36 crc kubenswrapper[4777]: I0216 21:55:36.560240 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs\" (UID: \"3a482208-548a-47dc-aa40-c23cbcdf5363\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs"
Feb 16 21:55:36 crc kubenswrapper[4777]: I0216 21:55:36.565672 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a482208-548a-47dc-aa40-c23cbcdf5363-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs\" (UID: \"3a482208-548a-47dc-aa40-c23cbcdf5363\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs"
Feb 16 21:55:36 crc kubenswrapper[4777]: I0216 21:55:36.826020 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs"
Feb 16 21:55:36 crc kubenswrapper[4777]: I0216 21:55:36.864492 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7"
Feb 16 21:55:36 crc kubenswrapper[4777]: I0216 21:55:36.864582 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7"
Feb 16 21:55:36 crc kubenswrapper[4777]: E0216 21:55:36.864700 4777 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 16 21:55:36 crc kubenswrapper[4777]: E0216 21:55:36.864745 4777 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 16 21:55:36 crc kubenswrapper[4777]: E0216 21:55:36.864807 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:52.864780109 +0000 UTC m=+1073.447281211 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "webhook-server-cert" not found
Feb 16 21:55:36 crc kubenswrapper[4777]: E0216 21:55:36.864825 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs podName:c1a0562b-8960-485e-9c6a-da5a73a18180 nodeName:}" failed. No retries permitted until 2026-02-16 21:55:52.86481891 +0000 UTC m=+1073.447320012 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs") pod "openstack-operator-controller-manager-9c8f544df-6dkb7" (UID: "c1a0562b-8960-485e-9c6a-da5a73a18180") : secret "metrics-server-cert" not found
Feb 16 21:55:36 crc kubenswrapper[4777]: E0216 21:55:36.909270 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc"
Feb 16 21:55:36 crc kubenswrapper[4777]: E0216 21:55:36.909585 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4285z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-868647ff47-k22zl_openstack-operators(367fde39-a8cd-443a-83f8-f9cb0e9f3576): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 16 21:55:36 crc kubenswrapper[4777]: E0216 21:55:36.911145 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" podUID="367fde39-a8cd-443a-83f8-f9cb0e9f3576"
Feb 16 21:55:37 crc kubenswrapper[4777]: E0216 21:55:37.140327 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" podUID="367fde39-a8cd-443a-83f8-f9cb0e9f3576"
Feb 16 21:55:37 crc kubenswrapper[4777]: E0216 21:55:37.682481 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1"
Feb 16 21:55:37 crc kubenswrapper[4777]: E0216 21:55:37.682992 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krgt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-slq9f_openstack-operators(583dde29-d4a1-4ec2-9021-3735d6c5717b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 16 21:55:37 crc kubenswrapper[4777]: E0216 21:55:37.684604 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" podUID="583dde29-d4a1-4ec2-9021-3735d6c5717b"
Feb 16 21:55:38 crc kubenswrapper[4777]: E0216 21:55:38.151590 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" podUID="583dde29-d4a1-4ec2-9021-3735d6c5717b"
Feb 16 21:55:38 crc kubenswrapper[4777]: E0216 21:55:38.991343 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2"
Feb 16 21:55:38 crc kubenswrapper[4777]: E0216 21:55:38.991612 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6cpc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-w4ld6_openstack-operators(7afb9e38-ac2c-483f-a98e-716bfa22ee6d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 16 21:55:38 crc kubenswrapper[4777]: E0216 21:55:38.992831 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6" podUID="7afb9e38-ac2c-483f-a98e-716bfa22ee6d"
Feb 16 21:55:39 crc kubenswrapper[4777]: E0216 21:55:39.157683 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6" podUID="7afb9e38-ac2c-483f-a98e-716bfa22ee6d"
Feb 16 21:55:39 crc 
kubenswrapper[4777]: E0216 21:55:39.732501 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 16 21:55:39 crc kubenswrapper[4777]: E0216 21:55:39.733230 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b6bmj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-mg48d_openstack-operators(c69b97e2-c7ed-4b0a-9e29-fcd457d7a453): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 16 21:55:39 crc kubenswrapper[4777]: E0216 21:55:39.734477 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" podUID="c69b97e2-c7ed-4b0a-9e29-fcd457d7a453" Feb 16 21:55:40 crc kubenswrapper[4777]: E0216 21:55:40.165769 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" podUID="c69b97e2-c7ed-4b0a-9e29-fcd457d7a453" Feb 16 21:55:43 crc kubenswrapper[4777]: I0216 21:55:43.273314 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-79d975b745-7shhv"] Feb 16 21:55:43 crc kubenswrapper[4777]: I0216 21:55:43.281894 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs"] Feb 16 21:55:43 crc kubenswrapper[4777]: W0216 21:55:43.306859 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d6bef33_b14f_435b_aa17_dfbed9c15d86.slice/crio-a0cd930a1e06a8bffd29a7330bdeeb655dccb43ec0863511ae55b4e1f5aa75a8 WatchSource:0}: Error finding container a0cd930a1e06a8bffd29a7330bdeeb655dccb43ec0863511ae55b4e1f5aa75a8: Status 404 returned error can't find the container with id a0cd930a1e06a8bffd29a7330bdeeb655dccb43ec0863511ae55b4e1f5aa75a8 Feb 16 21:55:43 crc kubenswrapper[4777]: W0216 21:55:43.321899 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a482208_548a_47dc_aa40_c23cbcdf5363.slice/crio-86cfcec30f9ad7a9848ee217dbfbfd43e8577180321bf26f0b81596fcd39ab44 WatchSource:0}: Error finding container 86cfcec30f9ad7a9848ee217dbfbfd43e8577180321bf26f0b81596fcd39ab44: Status 404 returned error can't find the container with id 86cfcec30f9ad7a9848ee217dbfbfd43e8577180321bf26f0b81596fcd39ab44 Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.217050 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6km9d" event={"ID":"7f900c19-63a0-40b9-a8c4-c51883a5087e","Type":"ContainerStarted","Data":"b31b265c1e40768c0c55cc6641418ab0738c06e72f869195ad4b0b187d8b7d70"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.217808 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6km9d" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.247299 4777 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2" event={"ID":"9fbfe46d-cddb-41eb-920a-0859f6fcbb27","Type":"ContainerStarted","Data":"6b573af99dbd3f7c26a9b5fb00d84ece930d39e64d7ef2390156d146e6c53b02"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.247348 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.259595 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6km9d" podStartSLOduration=7.194624281 podStartE2EDuration="25.259571732s" podCreationTimestamp="2026-02-16 21:55:19 +0000 UTC" firstStartedPulling="2026-02-16 21:55:21.66516502 +0000 UTC m=+1042.247666122" lastFinishedPulling="2026-02-16 21:55:39.730112471 +0000 UTC m=+1060.312613573" observedRunningTime="2026-02-16 21:55:44.25734982 +0000 UTC m=+1064.839850922" watchObservedRunningTime="2026-02-16 21:55:44.259571732 +0000 UTC m=+1064.842072834" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.279279 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" event={"ID":"20881f3c-e228-4c63-9eb9-dbedd55ffc14","Type":"ContainerStarted","Data":"776b03c9cd8f05bf5d93b680801bfc1c58bb69512f71eea5fd50d936f0b9929a"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.280142 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.297572 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" 
event={"ID":"d255b1d6-45e9-4749-8252-b77b97cd940a","Type":"ContainerStarted","Data":"bf364a4cf78341ebd4f1ff18566c2625eb0a020c8442f25678802a9b061f3ea4"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.298122 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.298122 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2" podStartSLOduration=6.723930532 podStartE2EDuration="24.29809845s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.136242532 +0000 UTC m=+1042.718743634" lastFinishedPulling="2026-02-16 21:55:39.71041042 +0000 UTC m=+1060.292911552" observedRunningTime="2026-02-16 21:55:44.292830912 +0000 UTC m=+1064.875332014" watchObservedRunningTime="2026-02-16 21:55:44.29809845 +0000 UTC m=+1064.880599552" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.319998 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87" event={"ID":"0994c976-c6c3-42e0-9b54-e09d2fac5447","Type":"ContainerStarted","Data":"d27e26082e56acf682fe9e750a14e94e7bd5c178c28e7dff5d54f5d4c9ef0519"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.320520 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.325939 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" event={"ID":"3a482208-548a-47dc-aa40-c23cbcdf5363","Type":"ContainerStarted","Data":"86cfcec30f9ad7a9848ee217dbfbfd43e8577180321bf26f0b81596fcd39ab44"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 
21:55:44.330939 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg" event={"ID":"c853fc19-52e2-407b-ac21-638cdf255085","Type":"ContainerStarted","Data":"60c28a1f07cdf41630061a61874ddedfee4d73edab0bf6b915cf12ea79ad20be"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.331809 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.333452 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" event={"ID":"a565ae46-9b4e-4310-8867-230d4a65eea5","Type":"ContainerStarted","Data":"6da77e6b88be246d48cdfed19a79bd76ce346191b505f9b85ac2217f9f59e91f"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.333954 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.334912 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" event={"ID":"1be584eb-83c6-4709-8f10-60474c6f3f93","Type":"ContainerStarted","Data":"a934e844a2f8f4f462520d74406a07c505a59fa73740b95695d4111217ffcc55"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.335354 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.336532 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" event={"ID":"1d6bef33-b14f-435b-aa17-dfbed9c15d86","Type":"ContainerStarted","Data":"a0cd930a1e06a8bffd29a7330bdeeb655dccb43ec0863511ae55b4e1f5aa75a8"} Feb 16 
21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.338304 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297" event={"ID":"b455e5b7-ca2c-4f12-91de-da9969199670","Type":"ContainerStarted","Data":"45c516e8d594816154673895b4189ad9a3553e3079f485ddd2a5376c4354751a"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.338950 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.339846 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6" event={"ID":"f4b16894-e8b0-4476-94ae-c6112200fa5d","Type":"ContainerStarted","Data":"8937edb7d980cfb0ca548e32c342d0703e943cf02f32552b55d1b4b4552ded11"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.340660 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.355050 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5" event={"ID":"522da514-4a99-4293-8fe8-45285b4f24eb","Type":"ContainerStarted","Data":"f89795550218c0e23c4965c386804ccc3910aff8db49b29ac337a5f3522d6ab7"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.355562 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.358200 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" 
event={"ID":"1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d","Type":"ContainerStarted","Data":"616afa2bedd2c5e623a55e0dd5de0f5373cc2c90be7263ba2de8eb6e3f039cd3"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.359010 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.359921 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" event={"ID":"35deb9f9-3cf5-4f92-9ba0-5f8eac14733a","Type":"ContainerStarted","Data":"a41baeb6fbc6618118e8ccfadb4150f307782103aa6030051e09badd4f82d80c"} Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.360230 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.372462 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" podStartSLOduration=3.618624505 podStartE2EDuration="24.3724428s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.232559418 +0000 UTC m=+1042.815060520" lastFinishedPulling="2026-02-16 21:55:42.986377723 +0000 UTC m=+1063.568878815" observedRunningTime="2026-02-16 21:55:44.332724678 +0000 UTC m=+1064.915225790" watchObservedRunningTime="2026-02-16 21:55:44.3724428 +0000 UTC m=+1064.954943902" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.377289 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" podStartSLOduration=3.462931297 podStartE2EDuration="24.377272335s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.201401846 +0000 UTC m=+1042.783902948" 
lastFinishedPulling="2026-02-16 21:55:43.115742874 +0000 UTC m=+1063.698243986" observedRunningTime="2026-02-16 21:55:44.371917615 +0000 UTC m=+1064.954418717" watchObservedRunningTime="2026-02-16 21:55:44.377272335 +0000 UTC m=+1064.959773437" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.406263 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87" podStartSLOduration=8.285794335 podStartE2EDuration="25.406243556s" podCreationTimestamp="2026-02-16 21:55:19 +0000 UTC" firstStartedPulling="2026-02-16 21:55:21.238098619 +0000 UTC m=+1041.820599721" lastFinishedPulling="2026-02-16 21:55:38.35854784 +0000 UTC m=+1058.941048942" observedRunningTime="2026-02-16 21:55:44.404338622 +0000 UTC m=+1064.986839724" watchObservedRunningTime="2026-02-16 21:55:44.406243556 +0000 UTC m=+1064.988744658" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.435699 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg" podStartSLOduration=8.160811961 podStartE2EDuration="24.43567858s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.083568278 +0000 UTC m=+1042.666069380" lastFinishedPulling="2026-02-16 21:55:38.358434887 +0000 UTC m=+1058.940935999" observedRunningTime="2026-02-16 21:55:44.435073703 +0000 UTC m=+1065.017574805" watchObservedRunningTime="2026-02-16 21:55:44.43567858 +0000 UTC m=+1065.018179682" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.472388 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" podStartSLOduration=3.710646589 podStartE2EDuration="24.472367056s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.201502348 +0000 UTC m=+1042.784003450" 
lastFinishedPulling="2026-02-16 21:55:42.963222815 +0000 UTC m=+1063.545723917" observedRunningTime="2026-02-16 21:55:44.467056168 +0000 UTC m=+1065.049557270" watchObservedRunningTime="2026-02-16 21:55:44.472367056 +0000 UTC m=+1065.054868148" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.500154 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" podStartSLOduration=3.748449697 podStartE2EDuration="24.500133213s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.232802694 +0000 UTC m=+1042.815303796" lastFinishedPulling="2026-02-16 21:55:42.98448621 +0000 UTC m=+1063.566987312" observedRunningTime="2026-02-16 21:55:44.493496407 +0000 UTC m=+1065.075997509" watchObservedRunningTime="2026-02-16 21:55:44.500133213 +0000 UTC m=+1065.082634315" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.522579 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297" podStartSLOduration=6.9257260590000005 podStartE2EDuration="24.522559081s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.11364471 +0000 UTC m=+1042.696145812" lastFinishedPulling="2026-02-16 21:55:39.710477722 +0000 UTC m=+1060.292978834" observedRunningTime="2026-02-16 21:55:44.517407077 +0000 UTC m=+1065.099908179" watchObservedRunningTime="2026-02-16 21:55:44.522559081 +0000 UTC m=+1065.105060183" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.559793 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" podStartSLOduration=5.2651053789999995 podStartE2EDuration="24.559770822s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.20119348 +0000 UTC m=+1042.783694582" 
lastFinishedPulling="2026-02-16 21:55:41.495858923 +0000 UTC m=+1062.078360025" observedRunningTime="2026-02-16 21:55:44.554108044 +0000 UTC m=+1065.136609146" watchObservedRunningTime="2026-02-16 21:55:44.559770822 +0000 UTC m=+1065.142271914" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.621467 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5" podStartSLOduration=8.917911093 podStartE2EDuration="25.621449698s" podCreationTimestamp="2026-02-16 21:55:19 +0000 UTC" firstStartedPulling="2026-02-16 21:55:21.655464788 +0000 UTC m=+1042.237965890" lastFinishedPulling="2026-02-16 21:55:38.359003393 +0000 UTC m=+1058.941504495" observedRunningTime="2026-02-16 21:55:44.620251125 +0000 UTC m=+1065.202752227" watchObservedRunningTime="2026-02-16 21:55:44.621449698 +0000 UTC m=+1065.203950800" Feb 16 21:55:44 crc kubenswrapper[4777]: I0216 21:55:44.670460 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6" podStartSLOduration=7.545815698 podStartE2EDuration="25.670440339s" podCreationTimestamp="2026-02-16 21:55:19 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.114067422 +0000 UTC m=+1042.696568514" lastFinishedPulling="2026-02-16 21:55:40.238692053 +0000 UTC m=+1060.821193155" observedRunningTime="2026-02-16 21:55:44.658641869 +0000 UTC m=+1065.241142971" watchObservedRunningTime="2026-02-16 21:55:44.670440339 +0000 UTC m=+1065.252941441" Feb 16 21:55:46 crc kubenswrapper[4777]: I0216 21:55:46.209916 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" podStartSLOduration=5.261460247 podStartE2EDuration="26.209891889s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.206113667 +0000 UTC m=+1042.788614769" 
lastFinishedPulling="2026-02-16 21:55:43.154545309 +0000 UTC m=+1063.737046411" observedRunningTime="2026-02-16 21:55:44.693025191 +0000 UTC m=+1065.275526283" watchObservedRunningTime="2026-02-16 21:55:46.209891889 +0000 UTC m=+1066.792392981" Feb 16 21:55:50 crc kubenswrapper[4777]: I0216 21:55:50.219302 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-rxd87" Feb 16 21:55:50 crc kubenswrapper[4777]: I0216 21:55:50.237173 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ssfj6" Feb 16 21:55:50 crc kubenswrapper[4777]: I0216 21:55:50.246987 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6km9d" Feb 16 21:55:50 crc kubenswrapper[4777]: I0216 21:55:50.373683 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-ss4k5" Feb 16 21:55:50 crc kubenswrapper[4777]: I0216 21:55:50.644664 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-znh8t" Feb 16 21:55:50 crc kubenswrapper[4777]: I0216 21:55:50.711555 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-nwzcm" Feb 16 21:55:50 crc kubenswrapper[4777]: I0216 21:55:50.852385 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-x2pd2" Feb 16 21:55:50 crc kubenswrapper[4777]: I0216 21:55:50.944283 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-p9lqg" Feb 16 21:55:51 crc 
kubenswrapper[4777]: I0216 21:55:51.020576 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-g8lbd" Feb 16 21:55:51 crc kubenswrapper[4777]: I0216 21:55:51.021492 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-k4297" Feb 16 21:55:51 crc kubenswrapper[4777]: I0216 21:55:51.242223 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-79996fd568-n7rmk" Feb 16 21:55:51 crc kubenswrapper[4777]: I0216 21:55:51.270271 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-mzrs2" Feb 16 21:55:51 crc kubenswrapper[4777]: I0216 21:55:51.274044 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-b7tpr" Feb 16 21:55:52 crc kubenswrapper[4777]: I0216 21:55:52.941997 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:52 crc kubenswrapper[4777]: I0216 21:55:52.943212 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:52 crc kubenswrapper[4777]: I0216 
21:55:52.950014 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-metrics-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:52 crc kubenswrapper[4777]: I0216 21:55:52.951174 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c1a0562b-8960-485e-9c6a-da5a73a18180-webhook-certs\") pod \"openstack-operator-controller-manager-9c8f544df-6dkb7\" (UID: \"c1a0562b-8960-485e-9c6a-da5a73a18180\") " pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:53 crc kubenswrapper[4777]: I0216 21:55:53.080771 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:55:55 crc kubenswrapper[4777]: E0216 21:55:55.100562 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a" Feb 16 21:55:55 crc kubenswrapper[4777]: E0216 21:55:55.101113 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkcp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-79d975b745-7shhv_openstack-operators(1d6bef33-b14f-435b-aa17-dfbed9c15d86): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 16 21:55:55 crc kubenswrapper[4777]: E0216 21:55:55.102400 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" podUID="1d6bef33-b14f-435b-aa17-dfbed9c15d86" Feb 16 21:55:55 crc kubenswrapper[4777]: E0216 21:55:55.468746 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a\\\"\"" pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" podUID="1d6bef33-b14f-435b-aa17-dfbed9c15d86" Feb 16 21:55:55 crc kubenswrapper[4777]: E0216 21:55:55.832283 4777 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e6f7c2a75883f63d270378b283faeee4c4c14fbd74b509c7da82621166f07b24" Feb 16 21:55:55 crc kubenswrapper[4777]: E0216 21:55:55.839743 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e6f7c2a75883f63d270378b283faeee4c4c14fbd74b509c7da82621166f07b24,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-bar
bican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:q
uay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD
_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_
MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_
DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK
_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xghs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs_openstack-operators(3a482208-548a-47dc-aa40-c23cbcdf5363): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 16 21:55:55 crc kubenswrapper[4777]: E0216 21:55:55.841266 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" podUID="3a482208-548a-47dc-aa40-c23cbcdf5363" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.433014 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7"] Feb 16 21:55:56 crc kubenswrapper[4777]: W0216 21:55:56.438009 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a0562b_8960_485e_9c6a_da5a73a18180.slice/crio-dbda8702e12fa5ddeaf5d7b524728053666c2eada573b934a7700ff7b281530b WatchSource:0}: Error finding container 
dbda8702e12fa5ddeaf5d7b524728053666c2eada573b934a7700ff7b281530b: Status 404 returned error can't find the container with id dbda8702e12fa5ddeaf5d7b524728053666c2eada573b934a7700ff7b281530b Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.474092 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" event={"ID":"c6f0dee9-40e3-42d5-8651-b340e075891c","Type":"ContainerStarted","Data":"2671a11f1a62e4572155a77122bf2f9eaea3d9089188b985ffca7c342293be22"} Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.474271 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.476210 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" event={"ID":"c69b97e2-c7ed-4b0a-9e29-fcd457d7a453","Type":"ContainerStarted","Data":"0c4d5d75e01caec67b06c53804928c42459756237ba90088b26df72f149547f5"} Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.476355 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.480248 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" event={"ID":"367fde39-a8cd-443a-83f8-f9cb0e9f3576","Type":"ContainerStarted","Data":"b30fc489b0c7ec3c1c06723dfc93a969cf78682b43dae70787e798501e1b9310"} Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.480456 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.481577 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6" event={"ID":"7afb9e38-ac2c-483f-a98e-716bfa22ee6d","Type":"ContainerStarted","Data":"e48560dc0f8ca8989b42b5d6d04417dfe1378af54586269e38c4df87f178b600"} Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.482610 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" event={"ID":"49e70c01-9a6c-488a-a766-a430583352fb","Type":"ContainerStarted","Data":"c936667b5fb930604f25bb96def08a7de29e5a5944cc6b676ce1daca71ff35a1"} Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.482743 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.486361 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" event={"ID":"3269da89-1f17-44af-8ea9-83eb493249a2","Type":"ContainerStarted","Data":"a84a83231977343520799bd5c8af63cdd6987a2d117ed27304a0c1d84feb147f"} Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.486691 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.490550 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" event={"ID":"c1a0562b-8960-485e-9c6a-da5a73a18180","Type":"ContainerStarted","Data":"dbda8702e12fa5ddeaf5d7b524728053666c2eada573b934a7700ff7b281530b"} Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.491924 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" 
event={"ID":"583dde29-d4a1-4ec2-9021-3735d6c5717b","Type":"ContainerStarted","Data":"2e8c20d45915ee25fda778d7e280835525ea1a34e962d5afe6724c249885c187"} Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.492199 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" Feb 16 21:55:56 crc kubenswrapper[4777]: E0216 21:55:56.492890 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e6f7c2a75883f63d270378b283faeee4c4c14fbd74b509c7da82621166f07b24\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" podUID="3a482208-548a-47dc-aa40-c23cbcdf5363" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.494834 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" podStartSLOduration=3.146610143 podStartE2EDuration="37.494817938s" podCreationTimestamp="2026-02-16 21:55:19 +0000 UTC" firstStartedPulling="2026-02-16 21:55:21.659877532 +0000 UTC m=+1042.242378634" lastFinishedPulling="2026-02-16 21:55:56.008085327 +0000 UTC m=+1076.590586429" observedRunningTime="2026-02-16 21:55:56.490492317 +0000 UTC m=+1077.072993439" watchObservedRunningTime="2026-02-16 21:55:56.494817938 +0000 UTC m=+1077.077319040" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.546930 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" podStartSLOduration=2.669085354 podStartE2EDuration="36.546915066s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.154083982 +0000 UTC m=+1042.736585084" lastFinishedPulling="2026-02-16 21:55:56.031913654 +0000 UTC 
m=+1076.614414796" observedRunningTime="2026-02-16 21:55:56.527030369 +0000 UTC m=+1077.109531471" watchObservedRunningTime="2026-02-16 21:55:56.546915066 +0000 UTC m=+1077.129416168" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.547430 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" podStartSLOduration=2.751536806 podStartE2EDuration="37.54742453s" podCreationTimestamp="2026-02-16 21:55:19 +0000 UTC" firstStartedPulling="2026-02-16 21:55:21.235973539 +0000 UTC m=+1041.818474641" lastFinishedPulling="2026-02-16 21:55:56.031861243 +0000 UTC m=+1076.614362365" observedRunningTime="2026-02-16 21:55:56.542744599 +0000 UTC m=+1077.125245691" watchObservedRunningTime="2026-02-16 21:55:56.54742453 +0000 UTC m=+1077.129925632" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.563691 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" podStartSLOduration=3.225660695 podStartE2EDuration="37.563676435s" podCreationTimestamp="2026-02-16 21:55:19 +0000 UTC" firstStartedPulling="2026-02-16 21:55:21.682870525 +0000 UTC m=+1042.265371627" lastFinishedPulling="2026-02-16 21:55:56.020886225 +0000 UTC m=+1076.603387367" observedRunningTime="2026-02-16 21:55:56.560022263 +0000 UTC m=+1077.142523365" watchObservedRunningTime="2026-02-16 21:55:56.563676435 +0000 UTC m=+1077.146177537" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.605420 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4ld6" podStartSLOduration=2.7507652 podStartE2EDuration="36.605403193s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.178102924 +0000 UTC m=+1042.760604026" lastFinishedPulling="2026-02-16 21:55:56.032740907 +0000 UTC m=+1076.615242019" 
observedRunningTime="2026-02-16 21:55:56.581921256 +0000 UTC m=+1077.164422348" watchObservedRunningTime="2026-02-16 21:55:56.605403193 +0000 UTC m=+1077.187904295" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.606330 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" podStartSLOduration=2.655551364 podStartE2EDuration="36.606323738s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.081566372 +0000 UTC m=+1042.664067474" lastFinishedPulling="2026-02-16 21:55:56.032338706 +0000 UTC m=+1076.614839848" observedRunningTime="2026-02-16 21:55:56.59923573 +0000 UTC m=+1077.181736832" watchObservedRunningTime="2026-02-16 21:55:56.606323738 +0000 UTC m=+1077.188824840" Feb 16 21:55:56 crc kubenswrapper[4777]: I0216 21:55:56.622520 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" podStartSLOduration=3.702847728 podStartE2EDuration="37.622502511s" podCreationTimestamp="2026-02-16 21:55:19 +0000 UTC" firstStartedPulling="2026-02-16 21:55:22.112672423 +0000 UTC m=+1042.695173525" lastFinishedPulling="2026-02-16 21:55:56.032327196 +0000 UTC m=+1076.614828308" observedRunningTime="2026-02-16 21:55:56.619843827 +0000 UTC m=+1077.202344929" watchObservedRunningTime="2026-02-16 21:55:56.622502511 +0000 UTC m=+1077.205003613" Feb 16 21:55:57 crc kubenswrapper[4777]: I0216 21:55:57.517164 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" event={"ID":"c1a0562b-8960-485e-9c6a-da5a73a18180","Type":"ContainerStarted","Data":"4e3788e185631a889f0e1bfb2585c6758192858e61517363e7f988f9d683cf3a"} Feb 16 21:55:57 crc kubenswrapper[4777]: I0216 21:55:57.517439 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:56:03 crc kubenswrapper[4777]: I0216 21:56:03.091009 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" Feb 16 21:56:03 crc kubenswrapper[4777]: I0216 21:56:03.133011 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-9c8f544df-6dkb7" podStartSLOduration=43.132978965 podStartE2EDuration="43.132978965s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:55:57.55981416 +0000 UTC m=+1078.142315292" watchObservedRunningTime="2026-02-16 21:56:03.132978965 +0000 UTC m=+1083.715480107" Feb 16 21:56:10 crc kubenswrapper[4777]: I0216 21:56:10.197203 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k22zl" Feb 16 21:56:10 crc kubenswrapper[4777]: I0216 21:56:10.293021 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-v2m7v" Feb 16 21:56:10 crc kubenswrapper[4777]: I0216 21:56:10.505929 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-9lpfx" Feb 16 21:56:10 crc kubenswrapper[4777]: I0216 21:56:10.560046 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-slq9f" Feb 16 21:56:10 crc kubenswrapper[4777]: I0216 21:56:10.616775 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-v2dbc" Feb 16 21:56:10 
crc kubenswrapper[4777]: I0216 21:56:10.631480 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" event={"ID":"1d6bef33-b14f-435b-aa17-dfbed9c15d86","Type":"ContainerStarted","Data":"91488e65eb79f859a41a04f4763af59c5f4d3e1ee47e1913c8ccbda6e4efe3a9"} Feb 16 21:56:10 crc kubenswrapper[4777]: I0216 21:56:10.632011 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" Feb 16 21:56:10 crc kubenswrapper[4777]: I0216 21:56:10.660353 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" podStartSLOduration=25.313985304 podStartE2EDuration="51.660333897s" podCreationTimestamp="2026-02-16 21:55:19 +0000 UTC" firstStartedPulling="2026-02-16 21:55:43.313883308 +0000 UTC m=+1063.896384400" lastFinishedPulling="2026-02-16 21:56:09.660231861 +0000 UTC m=+1090.242732993" observedRunningTime="2026-02-16 21:56:10.655998066 +0000 UTC m=+1091.238499178" watchObservedRunningTime="2026-02-16 21:56:10.660333897 +0000 UTC m=+1091.242834999" Feb 16 21:56:10 crc kubenswrapper[4777]: I0216 21:56:10.848700 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-mg48d" Feb 16 21:56:11 crc kubenswrapper[4777]: I0216 21:56:11.651373 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:56:11 crc kubenswrapper[4777]: I0216 21:56:11.651827 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:56:13 crc kubenswrapper[4777]: I0216 21:56:13.656929 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" event={"ID":"3a482208-548a-47dc-aa40-c23cbcdf5363","Type":"ContainerStarted","Data":"051ead4674f9e8c295bb3ac7634a337c6a9e06ff289c7abce6d4d9fd98622bc4"} Feb 16 21:56:13 crc kubenswrapper[4777]: I0216 21:56:13.657621 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" Feb 16 21:56:13 crc kubenswrapper[4777]: I0216 21:56:13.701299 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" podStartSLOduration=24.33853024 podStartE2EDuration="53.701272113s" podCreationTimestamp="2026-02-16 21:55:20 +0000 UTC" firstStartedPulling="2026-02-16 21:55:43.325894384 +0000 UTC m=+1063.908395486" lastFinishedPulling="2026-02-16 21:56:12.688636257 +0000 UTC m=+1093.271137359" observedRunningTime="2026-02-16 21:56:13.695260395 +0000 UTC m=+1094.277761537" watchObservedRunningTime="2026-02-16 21:56:13.701272113 +0000 UTC m=+1094.283773255" Feb 16 21:56:16 crc kubenswrapper[4777]: I0216 21:56:16.263965 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-7shhv" Feb 16 21:56:26 crc kubenswrapper[4777]: I0216 21:56:26.835557 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs" Feb 16 21:56:41 crc kubenswrapper[4777]: I0216 21:56:41.651676 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:56:41 crc kubenswrapper[4777]: I0216 21:56:41.652346 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.025323 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4lpvk"] Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.052030 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4lpvk"] Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.053461 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.057841 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.058037 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.058174 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.058322 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ldwpg" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.084110 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w627v"] Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.085484 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.094170 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.109779 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w627v"] Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.186147 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28vdh\" (UniqueName: \"kubernetes.io/projected/911a3e03-f85d-469f-8058-734e8c49c1c3-kube-api-access-28vdh\") pod \"dnsmasq-dns-675f4bcbfc-4lpvk\" (UID: \"911a3e03-f85d-469f-8058-734e8c49c1c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.186408 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-config\") pod \"dnsmasq-dns-78dd6ddcc-w627v\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.186436 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w627v\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.186452 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27d7n\" (UniqueName: \"kubernetes.io/projected/062563f7-55fc-4191-b518-96f17b4e5bf2-kube-api-access-27d7n\") pod \"dnsmasq-dns-78dd6ddcc-w627v\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.186474 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911a3e03-f85d-469f-8058-734e8c49c1c3-config\") pod \"dnsmasq-dns-675f4bcbfc-4lpvk\" (UID: \"911a3e03-f85d-469f-8058-734e8c49c1c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.287557 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28vdh\" (UniqueName: \"kubernetes.io/projected/911a3e03-f85d-469f-8058-734e8c49c1c3-kube-api-access-28vdh\") pod \"dnsmasq-dns-675f4bcbfc-4lpvk\" (UID: \"911a3e03-f85d-469f-8058-734e8c49c1c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.287638 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-config\") pod \"dnsmasq-dns-78dd6ddcc-w627v\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.287690 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w627v\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.287728 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27d7n\" (UniqueName: \"kubernetes.io/projected/062563f7-55fc-4191-b518-96f17b4e5bf2-kube-api-access-27d7n\") pod \"dnsmasq-dns-78dd6ddcc-w627v\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:56:44 
crc kubenswrapper[4777]: I0216 21:56:44.287759 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911a3e03-f85d-469f-8058-734e8c49c1c3-config\") pod \"dnsmasq-dns-675f4bcbfc-4lpvk\" (UID: \"911a3e03-f85d-469f-8058-734e8c49c1c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.288480 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911a3e03-f85d-469f-8058-734e8c49c1c3-config\") pod \"dnsmasq-dns-675f4bcbfc-4lpvk\" (UID: \"911a3e03-f85d-469f-8058-734e8c49c1c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.288825 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-config\") pod \"dnsmasq-dns-78dd6ddcc-w627v\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.288966 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w627v\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.306248 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27d7n\" (UniqueName: \"kubernetes.io/projected/062563f7-55fc-4191-b518-96f17b4e5bf2-kube-api-access-27d7n\") pod \"dnsmasq-dns-78dd6ddcc-w627v\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.306874 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-28vdh\" (UniqueName: \"kubernetes.io/projected/911a3e03-f85d-469f-8058-734e8c49c1c3-kube-api-access-28vdh\") pod \"dnsmasq-dns-675f4bcbfc-4lpvk\" (UID: \"911a3e03-f85d-469f-8058-734e8c49c1c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.380667 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.411606 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.847602 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4lpvk"] Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.933109 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w627v"] Feb 16 21:56:44 crc kubenswrapper[4777]: I0216 21:56:44.939831 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" event={"ID":"911a3e03-f85d-469f-8058-734e8c49c1c3","Type":"ContainerStarted","Data":"a82b5b11d456a7f668a4efd31a4728f1f7ac3d9d778c4d32ddd35eac5d7b1100"} Feb 16 21:56:45 crc kubenswrapper[4777]: I0216 21:56:45.948411 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" event={"ID":"062563f7-55fc-4191-b518-96f17b4e5bf2","Type":"ContainerStarted","Data":"24fdff7710539be482bee6d19c4183f781843f2fe98d5597be6ebfb143f06d21"} Feb 16 21:56:46 crc kubenswrapper[4777]: I0216 21:56:46.823950 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4lpvk"] Feb 16 21:56:46 crc kubenswrapper[4777]: I0216 21:56:46.898607 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xlk24"] Feb 16 21:56:46 crc kubenswrapper[4777]: I0216 21:56:46.899826 4777 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:56:46 crc kubenswrapper[4777]: I0216 21:56:46.915475 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xlk24"] Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.043795 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc6cz\" (UniqueName: \"kubernetes.io/projected/d06db55b-62f4-447a-9a65-10aebdf5957e-kube-api-access-sc6cz\") pod \"dnsmasq-dns-666b6646f7-xlk24\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") " pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.043861 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-config\") pod \"dnsmasq-dns-666b6646f7-xlk24\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") " pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.043891 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xlk24\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") " pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.144891 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc6cz\" (UniqueName: \"kubernetes.io/projected/d06db55b-62f4-447a-9a65-10aebdf5957e-kube-api-access-sc6cz\") pod \"dnsmasq-dns-666b6646f7-xlk24\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") " pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.145253 4777 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-config\") pod \"dnsmasq-dns-666b6646f7-xlk24\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") " pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.145296 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xlk24\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") " pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.146169 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-config\") pod \"dnsmasq-dns-666b6646f7-xlk24\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") " pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.146230 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xlk24\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") " pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.161605 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w627v"] Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.173649 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc6cz\" (UniqueName: \"kubernetes.io/projected/d06db55b-62f4-447a-9a65-10aebdf5957e-kube-api-access-sc6cz\") pod \"dnsmasq-dns-666b6646f7-xlk24\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") " pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.190793 4777 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n77h7"] Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.192043 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.197012 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n77h7"] Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.235909 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.247683 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-config\") pod \"dnsmasq-dns-57d769cc4f-n77h7\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.247829 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9zw4\" (UniqueName: \"kubernetes.io/projected/150daccc-a84b-457f-8685-468d3b017302-kube-api-access-g9zw4\") pod \"dnsmasq-dns-57d769cc4f-n77h7\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.247889 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-n77h7\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.349281 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-config\") pod \"dnsmasq-dns-57d769cc4f-n77h7\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.349356 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9zw4\" (UniqueName: \"kubernetes.io/projected/150daccc-a84b-457f-8685-468d3b017302-kube-api-access-g9zw4\") pod \"dnsmasq-dns-57d769cc4f-n77h7\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.349397 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-n77h7\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.350176 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-n77h7\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.350682 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-config\") pod \"dnsmasq-dns-57d769cc4f-n77h7\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.372702 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9zw4\" (UniqueName: \"kubernetes.io/projected/150daccc-a84b-457f-8685-468d3b017302-kube-api-access-g9zw4\") pod 
\"dnsmasq-dns-57d769cc4f-n77h7\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.511321 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.713329 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xlk24"] Feb 16 21:56:47 crc kubenswrapper[4777]: W0216 21:56:47.717057 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd06db55b_62f4_447a_9a65_10aebdf5957e.slice/crio-3694b6a2a125c97877f205be50c75832b20779abd449263bf378f88aa36763f8 WatchSource:0}: Error finding container 3694b6a2a125c97877f205be50c75832b20779abd449263bf378f88aa36763f8: Status 404 returned error can't find the container with id 3694b6a2a125c97877f205be50c75832b20779abd449263bf378f88aa36763f8 Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.914806 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n77h7"] Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.968242 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xlk24" event={"ID":"d06db55b-62f4-447a-9a65-10aebdf5957e","Type":"ContainerStarted","Data":"3694b6a2a125c97877f205be50c75832b20779abd449263bf378f88aa36763f8"} Feb 16 21:56:47 crc kubenswrapper[4777]: I0216 21:56:47.969505 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" event={"ID":"150daccc-a84b-457f-8685-468d3b017302","Type":"ContainerStarted","Data":"e783b3b5cd3a3192cec9876457ccaa3c35add34b06bfaea87cc891ad24ef3b0b"} Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.013292 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 16 21:56:48 crc 
kubenswrapper[4777]: I0216 21:56:48.014450 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.024465 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.024518 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.024673 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.024811 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.024480 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.026391 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.027140 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-l8cgx" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.046324 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.161263 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.161388 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4df600f3-97e1-4ac5-980b-2c42aecc5e81-config-data\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.161462 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4df600f3-97e1-4ac5-980b-2c42aecc5e81-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.161480 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4df600f3-97e1-4ac5-980b-2c42aecc5e81-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.161557 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4df600f3-97e1-4ac5-980b-2c42aecc5e81-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.161582 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.161595 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4df600f3-97e1-4ac5-980b-2c42aecc5e81-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.161614 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.161645 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.161811 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gswkm\" (UniqueName: \"kubernetes.io/projected/4df600f3-97e1-4ac5-980b-2c42aecc5e81-kube-api-access-gswkm\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.161872 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-02479261-644e-4693-ae9e-302141cca692\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02479261-644e-4693-ae9e-302141cca692\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.266160 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.266256 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4df600f3-97e1-4ac5-980b-2c42aecc5e81-config-data\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.266308 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4df600f3-97e1-4ac5-980b-2c42aecc5e81-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.266322 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4df600f3-97e1-4ac5-980b-2c42aecc5e81-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.266462 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4df600f3-97e1-4ac5-980b-2c42aecc5e81-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.266481 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.266495 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4df600f3-97e1-4ac5-980b-2c42aecc5e81-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.266533 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.266591 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.266641 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gswkm\" (UniqueName: \"kubernetes.io/projected/4df600f3-97e1-4ac5-980b-2c42aecc5e81-kube-api-access-gswkm\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.266671 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-02479261-644e-4693-ae9e-302141cca692\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02479261-644e-4693-ae9e-302141cca692\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc 
kubenswrapper[4777]: I0216 21:56:48.268195 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.268492 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.269346 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4df600f3-97e1-4ac5-980b-2c42aecc5e81-config-data\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.269555 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4df600f3-97e1-4ac5-980b-2c42aecc5e81-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.271140 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4df600f3-97e1-4ac5-980b-2c42aecc5e81-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.272581 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/4df600f3-97e1-4ac5-980b-2c42aecc5e81-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.272743 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.272772 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-02479261-644e-4693-ae9e-302141cca692\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02479261-644e-4693-ae9e-302141cca692\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7ad1950883ac085fe8acd60a9adc91f510d0532f0b226f075d5d013f5b37fcab/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.272865 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.273005 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4df600f3-97e1-4ac5-980b-2c42aecc5e81-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.276439 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4df600f3-97e1-4ac5-980b-2c42aecc5e81-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " 
pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.285739 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gswkm\" (UniqueName: \"kubernetes.io/projected/4df600f3-97e1-4ac5-980b-2c42aecc5e81-kube-api-access-gswkm\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.303681 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.306016 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.306459 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-02479261-644e-4693-ae9e-302141cca692\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02479261-644e-4693-ae9e-302141cca692\") pod \"rabbitmq-server-0\" (UID: \"4df600f3-97e1-4ac5-980b-2c42aecc5e81\") " pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.308314 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.309553 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.323245 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.323474 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gk2ps" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.323634 4777 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.323870 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.324013 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.337558 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.350056 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.371775 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9346e4ab-e1d7-42a9-8817-850a2f84e57d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.371918 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.371940 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgvwv\" (UniqueName: \"kubernetes.io/projected/9346e4ab-e1d7-42a9-8817-850a2f84e57d-kube-api-access-fgvwv\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.371980 4777 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ae3d5d10-a124-407a-80fd-f6d134548078\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae3d5d10-a124-407a-80fd-f6d134548078\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.372004 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.372097 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9346e4ab-e1d7-42a9-8817-850a2f84e57d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.372121 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9346e4ab-e1d7-42a9-8817-850a2f84e57d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.372262 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 
21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.372291 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9346e4ab-e1d7-42a9-8817-850a2f84e57d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.372311 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.372359 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9346e4ab-e1d7-42a9-8817-850a2f84e57d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.473767 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.473829 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9346e4ab-e1d7-42a9-8817-850a2f84e57d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 
21:56:48.473852 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.473916 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9346e4ab-e1d7-42a9-8817-850a2f84e57d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.473940 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9346e4ab-e1d7-42a9-8817-850a2f84e57d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.473970 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgvwv\" (UniqueName: \"kubernetes.io/projected/9346e4ab-e1d7-42a9-8817-850a2f84e57d-kube-api-access-fgvwv\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.473987 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.474010 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-ae3d5d10-a124-407a-80fd-f6d134548078\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae3d5d10-a124-407a-80fd-f6d134548078\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.474028 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.474058 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9346e4ab-e1d7-42a9-8817-850a2f84e57d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.474077 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9346e4ab-e1d7-42a9-8817-850a2f84e57d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.474118 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.474625 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.474642 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9346e4ab-e1d7-42a9-8817-850a2f84e57d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.475005 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9346e4ab-e1d7-42a9-8817-850a2f84e57d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.475514 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9346e4ab-e1d7-42a9-8817-850a2f84e57d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.478009 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.478056 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ae3d5d10-a124-407a-80fd-f6d134548078\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae3d5d10-a124-407a-80fd-f6d134548078\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cb855367ebf72660182324d748bd964d419b719bd9e4b02bdeff087a37ee3be2/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.478240 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.486753 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9346e4ab-e1d7-42a9-8817-850a2f84e57d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.486864 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9346e4ab-e1d7-42a9-8817-850a2f84e57d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.490293 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9346e4ab-e1d7-42a9-8817-850a2f84e57d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.491423 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgvwv\" (UniqueName: \"kubernetes.io/projected/9346e4ab-e1d7-42a9-8817-850a2f84e57d-kube-api-access-fgvwv\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.516772 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ae3d5d10-a124-407a-80fd-f6d134548078\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae3d5d10-a124-407a-80fd-f6d134548078\") pod \"rabbitmq-cell1-server-0\" (UID: \"9346e4ab-e1d7-42a9-8817-850a2f84e57d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:48 crc kubenswrapper[4777]: I0216 21:56:48.680749 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.461243 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.464114 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.473392 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.474143 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wjnct" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.474242 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.475176 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.499742 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.499773 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.599486 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-config-data-generated\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.599563 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-kolla-config\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.599597 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.599625 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-operator-scripts\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.599639 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.599672 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-config-data-default\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.599691 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h7t2\" (UniqueName: \"kubernetes.io/projected/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-kube-api-access-8h7t2\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.599731 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-144e80aa-7f4f-4100-b0ef-0c1e602af763\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-144e80aa-7f4f-4100-b0ef-0c1e602af763\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.701468 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-144e80aa-7f4f-4100-b0ef-0c1e602af763\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-144e80aa-7f4f-4100-b0ef-0c1e602af763\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.701546 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-config-data-generated\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.701588 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-kolla-config\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.701615 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.701643 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-operator-scripts\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.701659 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.701679 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-config-data-default\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.701698 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h7t2\" (UniqueName: \"kubernetes.io/projected/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-kube-api-access-8h7t2\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.702236 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-config-data-generated\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.703901 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.704314 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-kolla-config\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.710623 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.722099 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-config-data-default\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.722357 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h7t2\" (UniqueName: \"kubernetes.io/projected/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-kube-api-access-8h7t2\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.723065 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aadef7bb-2bff-4cd3-9a56-6b42ca417c94-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.723868 4777 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.723890 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-144e80aa-7f4f-4100-b0ef-0c1e602af763\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-144e80aa-7f4f-4100-b0ef-0c1e602af763\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ff109b698e9657ffc8935024939261a886f7d9e667c3080c1bdfa86719b3ce4f/globalmount\"" pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.776053 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-144e80aa-7f4f-4100-b0ef-0c1e602af763\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-144e80aa-7f4f-4100-b0ef-0c1e602af763\") pod \"openstack-galera-0\" (UID: \"aadef7bb-2bff-4cd3-9a56-6b42ca417c94\") " pod="openstack/openstack-galera-0" Feb 16 21:56:49 crc kubenswrapper[4777]: I0216 21:56:49.854729 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.066411 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.067964 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.070969 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.071036 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.073790 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-89crf" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.073991 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.109544 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.122909 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.122994 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.123069 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.123113 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6hkt\" (UniqueName: \"kubernetes.io/projected/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-kube-api-access-z6hkt\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.123282 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.123355 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.123393 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-46a9a087-6dae-439d-bca2-c59ed9a9444a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a9a087-6dae-439d-bca2-c59ed9a9444a\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.123448 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.162336 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.163566 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.169538 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.169771 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-89p5b" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.169907 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.187380 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.225633 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.225705 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6dd43f-e212-439a-b078-4a0a5c1db760-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.225788 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa6dd43f-e212-439a-b078-4a0a5c1db760-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.225813 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.225898 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.226003 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6hkt\" (UniqueName: \"kubernetes.io/projected/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-kube-api-access-z6hkt\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.226084 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa6dd43f-e212-439a-b078-4a0a5c1db760-config-data\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" 
Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.226255 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.226310 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.226333 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-46a9a087-6dae-439d-bca2-c59ed9a9444a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a9a087-6dae-439d-bca2-c59ed9a9444a\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.226353 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.226373 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: 
I0216 21:56:51.226414 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa6dd43f-e212-439a-b078-4a0a5c1db760-kolla-config\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.226475 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b64w7\" (UniqueName: \"kubernetes.io/projected/fa6dd43f-e212-439a-b078-4a0a5c1db760-kube-api-access-b64w7\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.227471 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.227863 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.228051 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.229406 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.237919 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.238082 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-46a9a087-6dae-439d-bca2-c59ed9a9444a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a9a087-6dae-439d-bca2-c59ed9a9444a\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/820af0fe893b19c722bbfb750e099ef0354d909ccdd40366eaff469c1137521b/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.241911 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6hkt\" (UniqueName: \"kubernetes.io/projected/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-kube-api-access-z6hkt\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.255467 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.273152 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-46a9a087-6dae-439d-bca2-c59ed9a9444a\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46a9a087-6dae-439d-bca2-c59ed9a9444a\") pod \"openstack-cell1-galera-0\" (UID: \"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a\") " pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.329967 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa6dd43f-e212-439a-b078-4a0a5c1db760-kolla-config\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.330035 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b64w7\" (UniqueName: \"kubernetes.io/projected/fa6dd43f-e212-439a-b078-4a0a5c1db760-kube-api-access-b64w7\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.330094 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6dd43f-e212-439a-b078-4a0a5c1db760-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.330114 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa6dd43f-e212-439a-b078-4a0a5c1db760-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.330159 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa6dd43f-e212-439a-b078-4a0a5c1db760-config-data\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " 
pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.332102 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa6dd43f-e212-439a-b078-4a0a5c1db760-kolla-config\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.335855 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa6dd43f-e212-439a-b078-4a0a5c1db760-config-data\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.341200 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6dd43f-e212-439a-b078-4a0a5c1db760-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.342235 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa6dd43f-e212-439a-b078-4a0a5c1db760-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.345395 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b64w7\" (UniqueName: \"kubernetes.io/projected/fa6dd43f-e212-439a-b078-4a0a5c1db760-kube-api-access-b64w7\") pod \"memcached-0\" (UID: \"fa6dd43f-e212-439a-b078-4a0a5c1db760\") " pod="openstack/memcached-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.419390 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 16 21:56:51 crc kubenswrapper[4777]: I0216 21:56:51.484374 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 16 21:56:53 crc kubenswrapper[4777]: I0216 21:56:53.357067 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 16 21:56:53 crc kubenswrapper[4777]: I0216 21:56:53.359003 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 16 21:56:53 crc kubenswrapper[4777]: I0216 21:56:53.362436 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-m7pfc" Feb 16 21:56:53 crc kubenswrapper[4777]: I0216 21:56:53.384055 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 16 21:56:53 crc kubenswrapper[4777]: I0216 21:56:53.389828 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7697\" (UniqueName: \"kubernetes.io/projected/0eaee64e-a445-4d25-9781-d7067c0841f8-kube-api-access-d7697\") pod \"kube-state-metrics-0\" (UID: \"0eaee64e-a445-4d25-9781-d7067c0841f8\") " pod="openstack/kube-state-metrics-0" Feb 16 21:56:53 crc kubenswrapper[4777]: I0216 21:56:53.491536 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7697\" (UniqueName: \"kubernetes.io/projected/0eaee64e-a445-4d25-9781-d7067c0841f8-kube-api-access-d7697\") pod \"kube-state-metrics-0\" (UID: \"0eaee64e-a445-4d25-9781-d7067c0841f8\") " pod="openstack/kube-state-metrics-0" Feb 16 21:56:53 crc kubenswrapper[4777]: I0216 21:56:53.514349 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7697\" (UniqueName: \"kubernetes.io/projected/0eaee64e-a445-4d25-9781-d7067c0841f8-kube-api-access-d7697\") pod \"kube-state-metrics-0\" (UID: 
\"0eaee64e-a445-4d25-9781-d7067c0841f8\") " pod="openstack/kube-state-metrics-0" Feb 16 21:56:53 crc kubenswrapper[4777]: I0216 21:56:53.685934 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.364841 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.367084 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.370941 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.370990 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.371195 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.371327 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.371856 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-xbtcj" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.417486 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 
21:56:54.417533 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.417579 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.417602 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.417641 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.417854 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztpnx\" (UniqueName: \"kubernetes.io/projected/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-kube-api-access-ztpnx\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.417943 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.444996 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.519744 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.519992 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.520100 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.520172 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.520252 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.520339 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztpnx\" (UniqueName: \"kubernetes.io/projected/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-kube-api-access-ztpnx\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.520427 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.521484 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.527343 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/secret/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.527525 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.528145 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.535046 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.537546 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztpnx\" (UniqueName: \"kubernetes.io/projected/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-kube-api-access-ztpnx\") pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.556476 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/29233f83-ffa4-4cbf-bc7e-435e4b44cd5e-cluster-tls-config\") 
pod \"alertmanager-metric-storage-0\" (UID: \"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e\") " pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.683927 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.734875 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.737040 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.743444 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.743634 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.745208 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-nxvh4" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.745429 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.745590 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.745811 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.746001 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 16 21:56:54 crc kubenswrapper[4777]: 
I0216 21:56:54.746463 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.750063 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.826930 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.827001 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.827026 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.827070 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mvkm\" (UniqueName: \"kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-kube-api-access-9mvkm\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.827100 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.827127 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.827197 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.827241 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.827275 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-1\") pod 
\"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.827307 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.929018 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.929076 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.929101 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.929126 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mvkm\" (UniqueName: \"kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-kube-api-access-9mvkm\") pod 
\"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.929149 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.929173 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.929230 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.929259 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.929281 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.929307 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.931459 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.931524 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.931633 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 
21:56:54.942602 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.942792 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.942854 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.943597 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.945542 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mvkm\" (UniqueName: \"kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-kube-api-access-9mvkm\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.945793 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.945820 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d7e1ae79a57ec286f5c0df29629ccbb0e6df636c5e8b94fad20c5c32a47e117a/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.949577 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:54 crc kubenswrapper[4777]: I0216 21:56:54.985141 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\") pod \"prometheus-metric-storage-0\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:55 crc kubenswrapper[4777]: I0216 21:56:55.063703 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 16 21:56:55 crc kubenswrapper[4777]: I0216 21:56:55.848386 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.547271 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-q5qdr"] Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.548831 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.553502 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-qkttz" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.553770 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.560040 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.566115 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5qdr"] Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.575039 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-7wbfs"] Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.578288 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.585790 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7wbfs"] Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702093 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f871748-de9b-43dc-87df-728b12402b6b-combined-ca-bundle\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702175 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-etc-ovs\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702193 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f871748-de9b-43dc-87df-728b12402b6b-ovn-controller-tls-certs\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702221 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-var-log\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702242 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7f871748-de9b-43dc-87df-728b12402b6b-var-log-ovn\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702280 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-var-run\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702319 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v75pb\" (UniqueName: \"kubernetes.io/projected/129ef04f-890a-41e2-936b-833a227993e5-kube-api-access-v75pb\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702408 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-var-lib\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702429 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rbtf\" (UniqueName: \"kubernetes.io/projected/7f871748-de9b-43dc-87df-728b12402b6b-kube-api-access-7rbtf\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702474 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/7f871748-de9b-43dc-87df-728b12402b6b-var-run\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702493 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/129ef04f-890a-41e2-936b-833a227993e5-scripts\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702538 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f871748-de9b-43dc-87df-728b12402b6b-scripts\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.702568 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f871748-de9b-43dc-87df-728b12402b6b-var-run-ovn\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803613 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-var-lib\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803658 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rbtf\" (UniqueName: \"kubernetes.io/projected/7f871748-de9b-43dc-87df-728b12402b6b-kube-api-access-7rbtf\") pod 
\"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803690 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f871748-de9b-43dc-87df-728b12402b6b-var-run\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803707 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/129ef04f-890a-41e2-936b-833a227993e5-scripts\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803752 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f871748-de9b-43dc-87df-728b12402b6b-scripts\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803772 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f871748-de9b-43dc-87df-728b12402b6b-var-run-ovn\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803798 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f871748-de9b-43dc-87df-728b12402b6b-combined-ca-bundle\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 
21:56:57.803823 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-etc-ovs\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803838 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f871748-de9b-43dc-87df-728b12402b6b-ovn-controller-tls-certs\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803858 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-var-log\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803875 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f871748-de9b-43dc-87df-728b12402b6b-var-log-ovn\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803898 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-var-run\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.803924 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v75pb\" (UniqueName: 
\"kubernetes.io/projected/129ef04f-890a-41e2-936b-833a227993e5-kube-api-access-v75pb\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.804367 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-etc-ovs\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.804380 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-var-lib\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.805013 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f871748-de9b-43dc-87df-728b12402b6b-var-run\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.805098 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-var-run\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.805261 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f871748-de9b-43dc-87df-728b12402b6b-var-log-ovn\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" 
Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.805282 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/129ef04f-890a-41e2-936b-833a227993e5-var-log\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.805361 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f871748-de9b-43dc-87df-728b12402b6b-var-run-ovn\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.806255 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/129ef04f-890a-41e2-936b-833a227993e5-scripts\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.806926 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f871748-de9b-43dc-87df-728b12402b6b-scripts\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.812237 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f871748-de9b-43dc-87df-728b12402b6b-ovn-controller-tls-certs\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.816591 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7f871748-de9b-43dc-87df-728b12402b6b-combined-ca-bundle\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.823167 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rbtf\" (UniqueName: \"kubernetes.io/projected/7f871748-de9b-43dc-87df-728b12402b6b-kube-api-access-7rbtf\") pod \"ovn-controller-q5qdr\" (UID: \"7f871748-de9b-43dc-87df-728b12402b6b\") " pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.836379 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v75pb\" (UniqueName: \"kubernetes.io/projected/129ef04f-890a-41e2-936b-833a227993e5-kube-api-access-v75pb\") pod \"ovn-controller-ovs-7wbfs\" (UID: \"129ef04f-890a-41e2-936b-833a227993e5\") " pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.880309 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5qdr" Feb 16 21:56:57 crc kubenswrapper[4777]: I0216 21:56:57.901828 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:57:00 crc kubenswrapper[4777]: I0216 21:57:00.975171 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 16 21:57:00 crc kubenswrapper[4777]: I0216 21:57:00.979609 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:00 crc kubenswrapper[4777]: I0216 21:57:00.983658 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-msb46" Feb 16 21:57:00 crc kubenswrapper[4777]: I0216 21:57:00.985756 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 16 21:57:00 crc kubenswrapper[4777]: I0216 21:57:00.985816 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 16 21:57:00 crc kubenswrapper[4777]: I0216 21:57:00.986463 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 16 21:57:00 crc kubenswrapper[4777]: I0216 21:57:00.986780 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.018279 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.061125 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.061202 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.061245 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.061297 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.061323 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbkr9\" (UniqueName: \"kubernetes.io/projected/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-kube-api-access-jbkr9\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.061356 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-config\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.061414 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9907d9cc-20ab-4d77-8743-795b468bb47c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9907d9cc-20ab-4d77-8743-795b468bb47c\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.061439 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.162854 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9907d9cc-20ab-4d77-8743-795b468bb47c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9907d9cc-20ab-4d77-8743-795b468bb47c\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.162908 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.162948 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.162991 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.163032 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-metrics-certs-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.163089 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.163118 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbkr9\" (UniqueName: \"kubernetes.io/projected/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-kube-api-access-jbkr9\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.163245 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-config\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.163810 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.164305 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-config\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.167527 4777 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.167566 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9907d9cc-20ab-4d77-8743-795b468bb47c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9907d9cc-20ab-4d77-8743-795b468bb47c\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/729565c1cc24096af412c589e4a4c0f0f791ce7af191e9c4f61b2347d59e3e50/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.168662 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.170068 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.170306 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.171948 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.175019 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.175052 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.175241 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.175708 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-sg5hj" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.176106 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.179085 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.192457 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbkr9\" (UniqueName: \"kubernetes.io/projected/62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7-kube-api-access-jbkr9\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.193700 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.234425 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9907d9cc-20ab-4d77-8743-795b468bb47c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9907d9cc-20ab-4d77-8743-795b468bb47c\") pod \"ovsdbserver-nb-0\" (UID: \"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7\") " pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.264365 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b13da6-0857-40ec-9c96-5a5a28c6dd69-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.264482 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79b13da6-0857-40ec-9c96-5a5a28c6dd69-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.264570 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e2ea00cf-bfaf-428f-a1c0-ea5403a2dc67\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2ea00cf-bfaf-428f-a1c0-ea5403a2dc67\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.264621 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79b13da6-0857-40ec-9c96-5a5a28c6dd69-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" 
Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.264837 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b13da6-0857-40ec-9c96-5a5a28c6dd69-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.264885 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b13da6-0857-40ec-9c96-5a5a28c6dd69-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.264958 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5trf8\" (UniqueName: \"kubernetes.io/projected/79b13da6-0857-40ec-9c96-5a5a28c6dd69-kube-api-access-5trf8\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.265013 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b13da6-0857-40ec-9c96-5a5a28c6dd69-config\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.337172 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.366622 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5trf8\" (UniqueName: \"kubernetes.io/projected/79b13da6-0857-40ec-9c96-5a5a28c6dd69-kube-api-access-5trf8\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.366706 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b13da6-0857-40ec-9c96-5a5a28c6dd69-config\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.366836 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b13da6-0857-40ec-9c96-5a5a28c6dd69-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.367605 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79b13da6-0857-40ec-9c96-5a5a28c6dd69-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.367687 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e2ea00cf-bfaf-428f-a1c0-ea5403a2dc67\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2ea00cf-bfaf-428f-a1c0-ea5403a2dc67\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.367752 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79b13da6-0857-40ec-9c96-5a5a28c6dd69-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.367897 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b13da6-0857-40ec-9c96-5a5a28c6dd69-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.367943 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b13da6-0857-40ec-9c96-5a5a28c6dd69-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.368158 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79b13da6-0857-40ec-9c96-5a5a28c6dd69-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.369536 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b13da6-0857-40ec-9c96-5a5a28c6dd69-config\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.371324 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/79b13da6-0857-40ec-9c96-5a5a28c6dd69-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.371341 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79b13da6-0857-40ec-9c96-5a5a28c6dd69-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.371523 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b13da6-0857-40ec-9c96-5a5a28c6dd69-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.372779 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.372808 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e2ea00cf-bfaf-428f-a1c0-ea5403a2dc67\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2ea00cf-bfaf-428f-a1c0-ea5403a2dc67\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fa792dd29cb1463d4de841edc4b38c61356b5fd00da6214dfec47f0a45f7e6ab/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.376804 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b13da6-0857-40ec-9c96-5a5a28c6dd69-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.383619 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5trf8\" (UniqueName: \"kubernetes.io/projected/79b13da6-0857-40ec-9c96-5a5a28c6dd69-kube-api-access-5trf8\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.404229 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e2ea00cf-bfaf-428f-a1c0-ea5403a2dc67\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2ea00cf-bfaf-428f-a1c0-ea5403a2dc67\") pod \"ovsdbserver-sb-0\" (UID: \"79b13da6-0857-40ec-9c96-5a5a28c6dd69\") " pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: I0216 21:57:01.539587 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:01 crc kubenswrapper[4777]: W0216 21:57:01.784299 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaadef7bb_2bff_4cd3_9a56_6b42ca417c94.slice/crio-ad57de2cc69da902474ef4117c8f142225a33f34629eed2d0fa9337dffc66091 WatchSource:0}: Error finding container ad57de2cc69da902474ef4117c8f142225a33f34629eed2d0fa9337dffc66091: Status 404 returned error can't find the container with id ad57de2cc69da902474ef4117c8f142225a33f34629eed2d0fa9337dffc66091 Feb 16 21:57:02 crc kubenswrapper[4777]: I0216 21:57:02.088363 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"aadef7bb-2bff-4cd3-9a56-6b42ca417c94","Type":"ContainerStarted","Data":"ad57de2cc69da902474ef4117c8f142225a33f34629eed2d0fa9337dffc66091"} Feb 16 21:57:02 crc kubenswrapper[4777]: E0216 21:57:02.703330 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 16 21:57:02 crc kubenswrapper[4777]: E0216 21:57:02.703498 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-27d7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-w627v_openstack(062563f7-55fc-4191-b518-96f17b4e5bf2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 16 21:57:02 crc kubenswrapper[4777]: E0216 21:57:02.704593 4777 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" podUID="062563f7-55fc-4191-b518-96f17b4e5bf2" Feb 16 21:57:02 crc kubenswrapper[4777]: E0216 21:57:02.760064 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 16 21:57:02 crc kubenswrapper[4777]: E0216 21:57:02.760212 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-28vdh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-4lpvk_openstack(911a3e03-f85d-469f-8058-734e8c49c1c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 16 21:57:02 crc kubenswrapper[4777]: E0216 21:57:02.761392 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" podUID="911a3e03-f85d-469f-8058-734e8c49c1c3" Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.142834 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.246116 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.526006 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 16 21:57:03 crc kubenswrapper[4777]: W0216 21:57:03.546165 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf52bd37b_edb4_4fcb_aee8_1e1c828b9c5a.slice/crio-a04a4a84babfef0080fdde33c81bdd9040a477154f8767e82199506f38c1aa38 WatchSource:0}: Error finding container a04a4a84babfef0080fdde33c81bdd9040a477154f8767e82199506f38c1aa38: Status 404 returned error can't find the container with id a04a4a84babfef0080fdde33c81bdd9040a477154f8767e82199506f38c1aa38 Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.553127 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 16 21:57:03 crc kubenswrapper[4777]: W0216 21:57:03.557954 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa6dd43f_e212_439a_b078_4a0a5c1db760.slice/crio-a10d25b2355cdc477146a66c97a96af301094dcf7b3ecb872e1cfa29ba1e6ccf WatchSource:0}: Error finding container a10d25b2355cdc477146a66c97a96af301094dcf7b3ecb872e1cfa29ba1e6ccf: Status 404 returned error can't find the container with id a10d25b2355cdc477146a66c97a96af301094dcf7b3ecb872e1cfa29ba1e6ccf Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.560472 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 16 21:57:03 crc kubenswrapper[4777]: W0216 
21:57:03.574175 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eaee64e_a445_4d25_9781_d7067c0841f8.slice/crio-9c21113bf82f86335dda62a1e5deddf695b1ab52f0f55bc93caf86b942759cf0 WatchSource:0}: Error finding container 9c21113bf82f86335dda62a1e5deddf695b1ab52f0f55bc93caf86b942759cf0: Status 404 returned error can't find the container with id 9c21113bf82f86335dda62a1e5deddf695b1ab52f0f55bc93caf86b942759cf0 Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.848985 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.855460 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.925188 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-dns-svc\") pod \"062563f7-55fc-4191-b518-96f17b4e5bf2\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.925235 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911a3e03-f85d-469f-8058-734e8c49c1c3-config\") pod \"911a3e03-f85d-469f-8058-734e8c49c1c3\" (UID: \"911a3e03-f85d-469f-8058-734e8c49c1c3\") " Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.925395 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-config\") pod \"062563f7-55fc-4191-b518-96f17b4e5bf2\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.925446 4777 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-28vdh\" (UniqueName: \"kubernetes.io/projected/911a3e03-f85d-469f-8058-734e8c49c1c3-kube-api-access-28vdh\") pod \"911a3e03-f85d-469f-8058-734e8c49c1c3\" (UID: \"911a3e03-f85d-469f-8058-734e8c49c1c3\") " Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.925482 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27d7n\" (UniqueName: \"kubernetes.io/projected/062563f7-55fc-4191-b518-96f17b4e5bf2-kube-api-access-27d7n\") pod \"062563f7-55fc-4191-b518-96f17b4e5bf2\" (UID: \"062563f7-55fc-4191-b518-96f17b4e5bf2\") " Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.926013 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/911a3e03-f85d-469f-8058-734e8c49c1c3-config" (OuterVolumeSpecName: "config") pod "911a3e03-f85d-469f-8058-734e8c49c1c3" (UID: "911a3e03-f85d-469f-8058-734e8c49c1c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.926381 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-config" (OuterVolumeSpecName: "config") pod "062563f7-55fc-4191-b518-96f17b4e5bf2" (UID: "062563f7-55fc-4191-b518-96f17b4e5bf2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.926503 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "062563f7-55fc-4191-b518-96f17b4e5bf2" (UID: "062563f7-55fc-4191-b518-96f17b4e5bf2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.927569 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.927665 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911a3e03-f85d-469f-8058-734e8c49c1c3-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.927749 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/062563f7-55fc-4191-b518-96f17b4e5bf2-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.933277 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/062563f7-55fc-4191-b518-96f17b4e5bf2-kube-api-access-27d7n" (OuterVolumeSpecName: "kube-api-access-27d7n") pod "062563f7-55fc-4191-b518-96f17b4e5bf2" (UID: "062563f7-55fc-4191-b518-96f17b4e5bf2"). InnerVolumeSpecName "kube-api-access-27d7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.941466 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.950997 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/911a3e03-f85d-469f-8058-734e8c49c1c3-kube-api-access-28vdh" (OuterVolumeSpecName: "kube-api-access-28vdh") pod "911a3e03-f85d-469f-8058-734e8c49c1c3" (UID: "911a3e03-f85d-469f-8058-734e8c49c1c3"). InnerVolumeSpecName "kube-api-access-28vdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.951341 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5qdr"] Feb 16 21:57:03 crc kubenswrapper[4777]: I0216 21:57:03.960854 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.029742 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28vdh\" (UniqueName: \"kubernetes.io/projected/911a3e03-f85d-469f-8058-734e8c49c1c3-kube-api-access-28vdh\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.029773 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27d7n\" (UniqueName: \"kubernetes.io/projected/062563f7-55fc-4191-b518-96f17b4e5bf2-kube-api-access-27d7n\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.064142 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 16 21:57:04 crc kubenswrapper[4777]: W0216 21:57:04.067826 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62bafcaf_aae9_46cf_bff3_3cf0f78ce7a7.slice/crio-dcdcf4e99bfd1aab948dee0a6887155a9ae1690ab990a2076c51992b211ee8ed WatchSource:0}: Error finding container dcdcf4e99bfd1aab948dee0a6887155a9ae1690ab990a2076c51992b211ee8ed: Status 404 returned error can't find the container with id dcdcf4e99bfd1aab948dee0a6887155a9ae1690ab990a2076c51992b211ee8ed Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.103496 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7","Type":"ContainerStarted","Data":"dcdcf4e99bfd1aab948dee0a6887155a9ae1690ab990a2076c51992b211ee8ed"} Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.105254 
4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" event={"ID":"062563f7-55fc-4191-b518-96f17b4e5bf2","Type":"ContainerDied","Data":"24fdff7710539be482bee6d19c4183f781843f2fe98d5597be6ebfb143f06d21"} Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.105312 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w627v" Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.110878 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9346e4ab-e1d7-42a9-8817-850a2f84e57d","Type":"ContainerStarted","Data":"d968d8c4873c6a53e35a03c33ca5a783d9f54624037193a52d433b1446712db4"} Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.112884 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a88f6b0-198b-4bb9-a593-607fae0b0a8d","Type":"ContainerStarted","Data":"0dec1232a0f60ed5380da83440d8733f4662b9c87536123ce65092d54ebd85c6"} Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.114323 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0eaee64e-a445-4d25-9781-d7067c0841f8","Type":"ContainerStarted","Data":"9c21113bf82f86335dda62a1e5deddf695b1ab52f0f55bc93caf86b942759cf0"} Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.115588 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4df600f3-97e1-4ac5-980b-2c42aecc5e81","Type":"ContainerStarted","Data":"167005589fed694e8a6f21453685187ebdb25ba67d95b6aa3ec243244d921325"} Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.116946 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" event={"ID":"911a3e03-f85d-469f-8058-734e8c49c1c3","Type":"ContainerDied","Data":"a82b5b11d456a7f668a4efd31a4728f1f7ac3d9d778c4d32ddd35eac5d7b1100"} Feb 16 21:57:04 
crc kubenswrapper[4777]: I0216 21:57:04.116966 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4lpvk" Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.118345 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5qdr" event={"ID":"7f871748-de9b-43dc-87df-728b12402b6b","Type":"ContainerStarted","Data":"6263c855e220bdda9e8d2e729ac6770eb53c12a724acc6dd020557a9c2f32732"} Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.120729 4777 generic.go:334] "Generic (PLEG): container finished" podID="d06db55b-62f4-447a-9a65-10aebdf5957e" containerID="b97aa9249307926bf3345047c1c834922e6f7a8d53d2a1a34ff74a8ddb74b440" exitCode=0 Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.120795 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xlk24" event={"ID":"d06db55b-62f4-447a-9a65-10aebdf5957e","Type":"ContainerDied","Data":"b97aa9249307926bf3345047c1c834922e6f7a8d53d2a1a34ff74a8ddb74b440"} Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.122929 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e","Type":"ContainerStarted","Data":"837c013b91d5b9999495426c064da8eb8743da33ec431dfa05ff32ae6399a460"} Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.132239 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a","Type":"ContainerStarted","Data":"a04a4a84babfef0080fdde33c81bdd9040a477154f8767e82199506f38c1aa38"} Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.143214 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fa6dd43f-e212-439a-b078-4a0a5c1db760","Type":"ContainerStarted","Data":"a10d25b2355cdc477146a66c97a96af301094dcf7b3ecb872e1cfa29ba1e6ccf"} Feb 16 21:57:04 crc 
kubenswrapper[4777]: I0216 21:57:04.164215 4777 generic.go:334] "Generic (PLEG): container finished" podID="150daccc-a84b-457f-8685-468d3b017302" containerID="bcbc16120135eeb57db1c4f6be780013a3ffd0a1e492e6de5f840dbdfe5f2507" exitCode=0 Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.164259 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" event={"ID":"150daccc-a84b-457f-8685-468d3b017302","Type":"ContainerDied","Data":"bcbc16120135eeb57db1c4f6be780013a3ffd0a1e492e6de5f840dbdfe5f2507"} Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.265183 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w627v"] Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.269295 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w627v"] Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.307169 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4lpvk"] Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.325188 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4lpvk"] Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.792117 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.900081 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7"] Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.901017 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.904129 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-8d2r8" Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.904503 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.905542 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.908231 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.908231 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Feb 16 21:57:04 crc kubenswrapper[4777]: I0216 21:57:04.912403 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7"] Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.049991 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/6bd47bb8-e12f-4cb4-a343-8f732e339484-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.050264 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cntmh\" (UniqueName: \"kubernetes.io/projected/6bd47bb8-e12f-4cb4-a343-8f732e339484-kube-api-access-cntmh\") pod 
\"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.050304 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd47bb8-e12f-4cb4-a343-8f732e339484-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.050387 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd47bb8-e12f-4cb4-a343-8f732e339484-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.050526 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/6bd47bb8-e12f-4cb4-a343-8f732e339484-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.073059 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb"] Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.079791 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.082213 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.082372 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.082476 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.107527 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb"] Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.120849 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7wbfs"] Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.151582 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cntmh\" (UniqueName: \"kubernetes.io/projected/6bd47bb8-e12f-4cb4-a343-8f732e339484-kube-api-access-cntmh\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.151621 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd47bb8-e12f-4cb4-a343-8f732e339484-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.151658 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6bd47bb8-e12f-4cb4-a343-8f732e339484-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.151702 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/6bd47bb8-e12f-4cb4-a343-8f732e339484-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.151754 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/6bd47bb8-e12f-4cb4-a343-8f732e339484-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.152583 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8"] Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.153206 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd47bb8-e12f-4cb4-a343-8f732e339484-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.155686 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.157503 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd47bb8-e12f-4cb4-a343-8f732e339484-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.159042 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.164998 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.181789 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/6bd47bb8-e12f-4cb4-a343-8f732e339484-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.189656 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cntmh\" (UniqueName: \"kubernetes.io/projected/6bd47bb8-e12f-4cb4-a343-8f732e339484-kube-api-access-cntmh\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.194875 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: 
\"kubernetes.io/secret/6bd47bb8-e12f-4cb4-a343-8f732e339484-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-jzwq7\" (UID: \"6bd47bb8-e12f-4cb4-a343-8f732e339484\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.212629 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8"] Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.223822 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" event={"ID":"150daccc-a84b-457f-8685-468d3b017302","Type":"ContainerStarted","Data":"361656afb488844efa84b835e661a92e001020b9017a597d9cb5058a39ff063c"} Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.223860 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.240699 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.241008 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" podStartSLOduration=3.158708116 podStartE2EDuration="18.240988867s" podCreationTimestamp="2026-02-16 21:56:47 +0000 UTC" firstStartedPulling="2026-02-16 21:56:47.910975954 +0000 UTC m=+1128.493477056" lastFinishedPulling="2026-02-16 21:57:02.993256705 +0000 UTC m=+1143.575757807" observedRunningTime="2026-02-16 21:57:05.240491653 +0000 UTC m=+1145.822992755" watchObservedRunningTime="2026-02-16 21:57:05.240988867 +0000 UTC m=+1145.823489969" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.251265 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xlk24" event={"ID":"d06db55b-62f4-447a-9a65-10aebdf5957e","Type":"ContainerStarted","Data":"431e52afb94c44a0473a067a3c24d17c1ce2ba92da291359bd6e0f6cf2df5737"} Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.252193 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.252949 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsrkf\" (UniqueName: \"kubernetes.io/projected/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-kube-api-access-bsrkf\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.252978 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/be9c8434-a8d2-4404-ad47-b9b91b21f439-cloudkitty-lokistack-query-frontend-http\") pod 
\"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.252997 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.253023 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.253041 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be9c8434-a8d2-4404-ad47-b9b91b21f439-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.253071 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/be9c8434-a8d2-4404-ad47-b9b91b21f439-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " 
pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.253087 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.253103 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxpmp\" (UniqueName: \"kubernetes.io/projected/be9c8434-a8d2-4404-ad47-b9b91b21f439-kube-api-access-sxpmp\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.253137 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.253153 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 
21:57:05.253207 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9c8434-a8d2-4404-ad47-b9b91b21f439-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.281766 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm"] Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.282872 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.290344 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.291007 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.291188 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.291290 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.291392 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.291490 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.295597 4777 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm"] Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.344706 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82"] Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.346833 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.349913 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-pv2hs" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.350930 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-xlk24" podStartSLOduration=4.170155716 podStartE2EDuration="19.350906725s" podCreationTimestamp="2026-02-16 21:56:46 +0000 UTC" firstStartedPulling="2026-02-16 21:56:47.722868135 +0000 UTC m=+1128.305369237" lastFinishedPulling="2026-02-16 21:57:02.903619144 +0000 UTC m=+1143.486120246" observedRunningTime="2026-02-16 21:57:05.289037663 +0000 UTC m=+1145.871538765" watchObservedRunningTime="2026-02-16 21:57:05.350906725 +0000 UTC m=+1145.933407837" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.355524 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.355568 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-loki-s3\") pod 
\"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.355706 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9c8434-a8d2-4404-ad47-b9b91b21f439-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.355759 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsrkf\" (UniqueName: \"kubernetes.io/projected/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-kube-api-access-bsrkf\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.355787 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/be9c8434-a8d2-4404-ad47-b9b91b21f439-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.355812 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc 
kubenswrapper[4777]: I0216 21:57:05.355844 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be9c8434-a8d2-4404-ad47-b9b91b21f439-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.355866 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.355929 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/be9c8434-a8d2-4404-ad47-b9b91b21f439-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.355950 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.355971 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxpmp\" (UniqueName: 
\"kubernetes.io/projected/be9c8434-a8d2-4404-ad47-b9b91b21f439-kube-api-access-sxpmp\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.366799 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9c8434-a8d2-4404-ad47-b9b91b21f439-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.367059 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.368802 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.369486 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be9c8434-a8d2-4404-ad47-b9b91b21f439-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " 
pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.371462 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.371926 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/be9c8434-a8d2-4404-ad47-b9b91b21f439-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.371969 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.378148 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/be9c8434-a8d2-4404-ad47-b9b91b21f439-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.382324 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.387138 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsrkf\" (UniqueName: \"kubernetes.io/projected/dd2cb1a1-9d03-4791-bf4d-c93f79816e59-kube-api-access-bsrkf\") pod \"cloudkitty-lokistack-querier-58c84b5844-7jdcb\" (UID: \"dd2cb1a1-9d03-4791-bf4d-c93f79816e59\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.388826 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxpmp\" (UniqueName: \"kubernetes.io/projected/be9c8434-a8d2-4404-ad47-b9b91b21f439-kube-api-access-sxpmp\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8\" (UID: \"be9c8434-a8d2-4404-ad47-b9b91b21f439\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.402348 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82"] Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.432869 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459352 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a24ca457-99d2-4651-9220-cfca7e58df3e-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459406 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459431 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459453 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459472 
4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1d115cc4-a668-4cd2-b4ae-f812341416a6-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459625 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459661 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvdsl\" (UniqueName: \"kubernetes.io/projected/1d115cc4-a668-4cd2-b4ae-f812341416a6-kube-api-access-qvdsl\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459708 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgfz4\" (UniqueName: \"kubernetes.io/projected/a24ca457-99d2-4651-9220-cfca7e58df3e-kube-api-access-tgfz4\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459739 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459771 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459852 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459878 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1d115cc4-a668-4cd2-b4ae-f812341416a6-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459895 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459917 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459936 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.459980 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a24ca457-99d2-4651-9220-cfca7e58df3e-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.460000 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.460020 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: W0216 21:57:05.505664 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79b13da6_0857_40ec_9c96_5a5a28c6dd69.slice/crio-9934304c2934f31900d4da595fed6e6869075c25c6eb9fd3da5ebd6966bb1b32 WatchSource:0}: Error finding container 9934304c2934f31900d4da595fed6e6869075c25c6eb9fd3da5ebd6966bb1b32: Status 404 returned error can't find the container with id 9934304c2934f31900d4da595fed6e6869075c25c6eb9fd3da5ebd6966bb1b32 Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.556572 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.561913 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a24ca457-99d2-4651-9220-cfca7e58df3e-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.561958 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.562000 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.562021 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.563092 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.563794 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.567644 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.568809 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a24ca457-99d2-4651-9220-cfca7e58df3e-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.568859 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1d115cc4-a668-4cd2-b4ae-f812341416a6-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.568879 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.568998 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvdsl\" (UniqueName: \"kubernetes.io/projected/1d115cc4-a668-4cd2-b4ae-f812341416a6-kube-api-access-qvdsl\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.569119 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgfz4\" (UniqueName: \"kubernetes.io/projected/a24ca457-99d2-4651-9220-cfca7e58df3e-kube-api-access-tgfz4\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.569143 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.569228 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" 
(UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.569273 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.569297 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1d115cc4-a668-4cd2-b4ae-f812341416a6-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.569316 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.572878 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc 
kubenswrapper[4777]: I0216 21:57:05.570339 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.571652 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.572015 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.570319 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.577387 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1d115cc4-a668-4cd2-b4ae-f812341416a6-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") 
" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.582944 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.583069 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a24ca457-99d2-4651-9220-cfca7e58df3e-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.583094 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.583117 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.584854 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: 
\"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.585216 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.585273 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a24ca457-99d2-4651-9220-cfca7e58df3e-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.585859 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d115cc4-a668-4cd2-b4ae-f812341416a6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.587152 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1d115cc4-a668-4cd2-b4ae-f812341416a6-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc 
kubenswrapper[4777]: I0216 21:57:05.587408 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a24ca457-99d2-4651-9220-cfca7e58df3e-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.588224 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgfz4\" (UniqueName: \"kubernetes.io/projected/a24ca457-99d2-4651-9220-cfca7e58df3e-kube-api-access-tgfz4\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.589958 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a24ca457-99d2-4651-9220-cfca7e58df3e-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-vsxfm\" (UID: \"a24ca457-99d2-4651-9220-cfca7e58df3e\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.592771 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvdsl\" (UniqueName: \"kubernetes.io/projected/1d115cc4-a668-4cd2-b4ae-f812341416a6-kube-api-access-qvdsl\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-8hh82\" (UID: \"1d115cc4-a668-4cd2-b4ae-f812341416a6\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.606749 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:05 crc kubenswrapper[4777]: I0216 21:57:05.737879 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.040567 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.041556 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.047028 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.047247 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.052116 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.132944 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.134897 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.138371 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.139067 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.158657 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.194844 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.194893 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.194920 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.194957 4777 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.194981 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.195012 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqxwp\" (UniqueName: \"kubernetes.io/projected/a882b0c3-7f2e-446b-aea4-476cacffb112-kube-api-access-cqxwp\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.195040 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a882b0c3-7f2e-446b-aea4-476cacffb112-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.195116 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.205387 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="062563f7-55fc-4191-b518-96f17b4e5bf2" path="/var/lib/kubelet/pods/062563f7-55fc-4191-b518-96f17b4e5bf2/volumes"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.206179 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="911a3e03-f85d-469f-8058-734e8c49c1c3" path="/var/lib/kubelet/pods/911a3e03-f85d-469f-8058-734e8c49c1c3/volumes"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.227609 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"]
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.229082 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.231871 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.232548 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.263468 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"]
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.265625 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7wbfs" event={"ID":"129ef04f-890a-41e2-936b-833a227993e5","Type":"ContainerStarted","Data":"458d885085247932bdabcb9fb12ba717eddacb3b4fbc3d42fada0d3d14b728f4"}
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.267203 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"79b13da6-0857-40ec-9c96-5a5a28c6dd69","Type":"ContainerStarted","Data":"9934304c2934f31900d4da595fed6e6869075c25c6eb9fd3da5ebd6966bb1b32"}
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297142 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297187 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297228 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqxwp\" (UniqueName: \"kubernetes.io/projected/a882b0c3-7f2e-446b-aea4-476cacffb112-kube-api-access-cqxwp\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297264 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297288 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a882b0c3-7f2e-446b-aea4-476cacffb112-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297330 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297415 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297460 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297482 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297511 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297543 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fscq\" (UniqueName: \"kubernetes.io/projected/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-kube-api-access-8fscq\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297563 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297584 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297611 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.297638 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.298163 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.298269 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.299265 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a882b0c3-7f2e-446b-aea4-476cacffb112-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.299490 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.303302 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.303328 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.304140 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a882b0c3-7f2e-446b-aea4-476cacffb112-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.314452 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqxwp\" (UniqueName: \"kubernetes.io/projected/a882b0c3-7f2e-446b-aea4-476cacffb112-kube-api-access-cqxwp\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.326565 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.329109 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a882b0c3-7f2e-446b-aea4-476cacffb112\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.389098 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.399067 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.399158 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.399332 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400027 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400168 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fscq\" (UniqueName: \"kubernetes.io/projected/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-kube-api-access-8fscq\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400193 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400241 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400264 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c4ae45-a8be-4fce-a939-53cf29c33a77-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400295 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400357 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvgd8\" (UniqueName: \"kubernetes.io/projected/c2c4ae45-a8be-4fce-a939-53cf29c33a77-kube-api-access-nvgd8\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400399 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400430 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400451 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400497 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400522 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.400998 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.403652 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.404563 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.404643 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.406841 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.419216 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fscq\" (UniqueName: \"kubernetes.io/projected/ae4b796b-cd3f-4a97-a43f-9fec28e71ac7-kube-api-access-8fscq\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.424837 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.457673 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.501681 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.501725 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c4ae45-a8be-4fce-a939-53cf29c33a77-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.501754 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.501788 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvgd8\" (UniqueName: \"kubernetes.io/projected/c2c4ae45-a8be-4fce-a939-53cf29c33a77-kube-api-access-nvgd8\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.501826 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.501853 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.501894 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.502346 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.502658 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.503368 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c4ae45-a8be-4fce-a939-53cf29c33a77-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.505607 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.506187 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.507562 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/c2c4ae45-a8be-4fce-a939-53cf29c33a77-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.517726 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvgd8\" (UniqueName: \"kubernetes.io/projected/c2c4ae45-a8be-4fce-a939-53cf29c33a77-kube-api-access-nvgd8\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.522635 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"c2c4ae45-a8be-4fce-a939-53cf29c33a77\") " pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:06 crc kubenswrapper[4777]: I0216 21:57:06.563775 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:11 crc kubenswrapper[4777]: I0216 21:57:11.651750 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 21:57:11 crc kubenswrapper[4777]: I0216 21:57:11.652505 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 21:57:11 crc kubenswrapper[4777]: I0216 21:57:11.652570 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj"
Feb 16 21:57:11 crc kubenswrapper[4777]: I0216 21:57:11.653417 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09c5b65387f90ae361507692bc923fcadef41388ff0532f5cec9f60ce0409e3f"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 16 21:57:11 crc kubenswrapper[4777]: I0216 21:57:11.653596 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://09c5b65387f90ae361507692bc923fcadef41388ff0532f5cec9f60ce0409e3f" gracePeriod=600
Feb 16 21:57:12 crc kubenswrapper[4777]: I0216 21:57:12.238897 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-xlk24"
Feb 16 21:57:12 crc kubenswrapper[4777]: I0216 21:57:12.343353 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="09c5b65387f90ae361507692bc923fcadef41388ff0532f5cec9f60ce0409e3f" exitCode=0
Feb 16 21:57:12 crc kubenswrapper[4777]: I0216 21:57:12.345042 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"09c5b65387f90ae361507692bc923fcadef41388ff0532f5cec9f60ce0409e3f"}
Feb 16 21:57:12 crc kubenswrapper[4777]: I0216 21:57:12.345156 4777 scope.go:117] "RemoveContainer" containerID="7bcaca13099ae0d1981555aa9ad645633be960905be53ca0bdd76ed67ab8a1a2"
Feb 16 21:57:12 crc kubenswrapper[4777]: I0216 21:57:12.512899 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7"
Feb 16 21:57:12 crc kubenswrapper[4777]: I0216 21:57:12.568421 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xlk24"]
Feb 16 21:57:12 crc kubenswrapper[4777]: I0216 21:57:12.568662 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-xlk24" podUID="d06db55b-62f4-447a-9a65-10aebdf5957e" containerName="dnsmasq-dns" containerID="cri-o://431e52afb94c44a0473a067a3c24d17c1ce2ba92da291359bd6e0f6cf2df5737" gracePeriod=10
Feb 16 21:57:13 crc kubenswrapper[4777]: I0216 21:57:13.356157 4777 generic.go:334] "Generic (PLEG): container finished" podID="d06db55b-62f4-447a-9a65-10aebdf5957e" containerID="431e52afb94c44a0473a067a3c24d17c1ce2ba92da291359bd6e0f6cf2df5737" exitCode=0
Feb 16 21:57:13 crc kubenswrapper[4777]: I0216 21:57:13.356590 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xlk24" event={"ID":"d06db55b-62f4-447a-9a65-10aebdf5957e","Type":"ContainerDied","Data":"431e52afb94c44a0473a067a3c24d17c1ce2ba92da291359bd6e0f6cf2df5737"}
Feb 16 21:57:13 crc kubenswrapper[4777]: I0216 21:57:13.997200 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb"]
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.012229 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7"]
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.026594 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xlk24"
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.152826 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc6cz\" (UniqueName: \"kubernetes.io/projected/d06db55b-62f4-447a-9a65-10aebdf5957e-kube-api-access-sc6cz\") pod \"d06db55b-62f4-447a-9a65-10aebdf5957e\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") "
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.153369 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-config\") pod \"d06db55b-62f4-447a-9a65-10aebdf5957e\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") "
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.153431 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-dns-svc\") pod \"d06db55b-62f4-447a-9a65-10aebdf5957e\" (UID: \"d06db55b-62f4-447a-9a65-10aebdf5957e\") "
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.161783 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06db55b-62f4-447a-9a65-10aebdf5957e-kube-api-access-sc6cz" (OuterVolumeSpecName: "kube-api-access-sc6cz") pod "d06db55b-62f4-447a-9a65-10aebdf5957e" (UID: "d06db55b-62f4-447a-9a65-10aebdf5957e"). InnerVolumeSpecName "kube-api-access-sc6cz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.196482 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-config" (OuterVolumeSpecName: "config") pod "d06db55b-62f4-447a-9a65-10aebdf5957e" (UID: "d06db55b-62f4-447a-9a65-10aebdf5957e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.199196 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d06db55b-62f4-447a-9a65-10aebdf5957e" (UID: "d06db55b-62f4-447a-9a65-10aebdf5957e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.199739 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"]
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.255768 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc6cz\" (UniqueName: \"kubernetes.io/projected/d06db55b-62f4-447a-9a65-10aebdf5957e-kube-api-access-sc6cz\") on node \"crc\" DevicePath \"\""
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.255805 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-config\") on node \"crc\" DevicePath \"\""
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.255815 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d06db55b-62f4-447a-9a65-10aebdf5957e-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.375290 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fa6dd43f-e212-439a-b078-4a0a5c1db760","Type":"ContainerStarted","Data":"94ad7ae72d0dcc70cd3516a63b7072f7d4705bea9cffddbf6087fbc4f29440a5"}
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.375624 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.379598 4777 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-666b6646f7-xlk24" event={"ID":"d06db55b-62f4-447a-9a65-10aebdf5957e","Type":"ContainerDied","Data":"3694b6a2a125c97877f205be50c75832b20779abd449263bf378f88aa36763f8"} Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.379637 4777 scope.go:117] "RemoveContainer" containerID="431e52afb94c44a0473a067a3c24d17c1ce2ba92da291359bd6e0f6cf2df5737" Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.379729 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xlk24" Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.389189 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8"] Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.397773 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"79b7f65a1dac7c39999487fce7121e7b9f97dac2a41e3e237c9dfb015a4112fd"} Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.413142 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm"] Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.420134 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82"] Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.424290 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.247525049 podStartE2EDuration="23.424276954s" podCreationTimestamp="2026-02-16 21:56:51 +0000 UTC" firstStartedPulling="2026-02-16 21:57:03.56189057 +0000 UTC m=+1144.144391672" lastFinishedPulling="2026-02-16 21:57:12.738642465 +0000 UTC m=+1153.321143577" observedRunningTime="2026-02-16 21:57:14.391007522 +0000 UTC m=+1154.973508634" 
watchObservedRunningTime="2026-02-16 21:57:14.424276954 +0000 UTC m=+1155.006778056" Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.433647 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.439802 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.452961 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xlk24"] Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.459591 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xlk24"] Feb 16 21:57:14 crc kubenswrapper[4777]: W0216 21:57:14.539949 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bd47bb8_e12f_4cb4_a343_8f732e339484.slice/crio-ff0613e7f733f856b302790307ee1f69d032439f5b3c3677b7dc21971340a6bb WatchSource:0}: Error finding container ff0613e7f733f856b302790307ee1f69d032439f5b3c3677b7dc21971340a6bb: Status 404 returned error can't find the container with id ff0613e7f733f856b302790307ee1f69d032439f5b3c3677b7dc21971340a6bb Feb 16 21:57:14 crc kubenswrapper[4777]: W0216 21:57:14.553640 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd2cb1a1_9d03_4791_bf4d_c93f79816e59.slice/crio-5687093bdd9fddd139a6a060320df00022bd1fee1d355b55bd441cf4fa1bd1f6 WatchSource:0}: Error finding container 5687093bdd9fddd139a6a060320df00022bd1fee1d355b55bd441cf4fa1bd1f6: Status 404 returned error can't find the container with id 5687093bdd9fddd139a6a060320df00022bd1fee1d355b55bd441cf4fa1bd1f6 Feb 16 21:57:14 crc kubenswrapper[4777]: I0216 21:57:14.574236 4777 scope.go:117] "RemoveContainer" 
containerID="b97aa9249307926bf3345047c1c834922e6f7a8d53d2a1a34ff74a8ddb74b440" Feb 16 21:57:14 crc kubenswrapper[4777]: W0216 21:57:14.582076 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae4b796b_cd3f_4a97_a43f_9fec28e71ac7.slice/crio-e92dbf0f057f6651359c53a7d21228d9cc26c7ea170bf38752ec51bb73154dfe WatchSource:0}: Error finding container e92dbf0f057f6651359c53a7d21228d9cc26c7ea170bf38752ec51bb73154dfe: Status 404 returned error can't find the container with id e92dbf0f057f6651359c53a7d21228d9cc26c7ea170bf38752ec51bb73154dfe Feb 16 21:57:14 crc kubenswrapper[4777]: W0216 21:57:14.582434 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe9c8434_a8d2_4404_ad47_b9b91b21f439.slice/crio-0c249d84484fb10e04a1384027260228ffa08336d43fc2865f780b957fd6c45e WatchSource:0}: Error finding container 0c249d84484fb10e04a1384027260228ffa08336d43fc2865f780b957fd6c45e: Status 404 returned error can't find the container with id 0c249d84484fb10e04a1384027260228ffa08336d43fc2865f780b957fd6c45e Feb 16 21:57:14 crc kubenswrapper[4777]: W0216 21:57:14.588816 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda24ca457_99d2_4651_9220_cfca7e58df3e.slice/crio-63fad4ed28a1314a167a0c65192028906e7d7b6d73dd8819a688303f11ac6653 WatchSource:0}: Error finding container 63fad4ed28a1314a167a0c65192028906e7d7b6d73dd8819a688303f11ac6653: Status 404 returned error can't find the container with id 63fad4ed28a1314a167a0c65192028906e7d7b6d73dd8819a688303f11ac6653 Feb 16 21:57:15 crc kubenswrapper[4777]: I0216 21:57:15.413376 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"aadef7bb-2bff-4cd3-9a56-6b42ca417c94","Type":"ContainerStarted","Data":"3612329325ef8db7d4660e5364a7ff7b10180056ebebc09e823cd18bb581a20d"} Feb 16 21:57:15 crc kubenswrapper[4777]: I0216 21:57:15.416033 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" event={"ID":"a24ca457-99d2-4651-9220-cfca7e58df3e","Type":"ContainerStarted","Data":"63fad4ed28a1314a167a0c65192028906e7d7b6d73dd8819a688303f11ac6653"} Feb 16 21:57:15 crc kubenswrapper[4777]: I0216 21:57:15.417702 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7","Type":"ContainerStarted","Data":"e92dbf0f057f6651359c53a7d21228d9cc26c7ea170bf38752ec51bb73154dfe"} Feb 16 21:57:15 crc kubenswrapper[4777]: I0216 21:57:15.419200 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" event={"ID":"dd2cb1a1-9d03-4791-bf4d-c93f79816e59","Type":"ContainerStarted","Data":"5687093bdd9fddd139a6a060320df00022bd1fee1d355b55bd441cf4fa1bd1f6"} Feb 16 21:57:15 crc kubenswrapper[4777]: I0216 21:57:15.420315 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" event={"ID":"1d115cc4-a668-4cd2-b4ae-f812341416a6","Type":"ContainerStarted","Data":"a838406ae38e6cd16156e925d9a03569be7c230817c235e545f346cd1e006878"} Feb 16 21:57:15 crc kubenswrapper[4777]: I0216 21:57:15.423301 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" event={"ID":"6bd47bb8-e12f-4cb4-a343-8f732e339484","Type":"ContainerStarted","Data":"ff0613e7f733f856b302790307ee1f69d032439f5b3c3677b7dc21971340a6bb"} Feb 16 21:57:15 crc kubenswrapper[4777]: I0216 21:57:15.424909 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" 
event={"ID":"a882b0c3-7f2e-446b-aea4-476cacffb112","Type":"ContainerStarted","Data":"9248dbe10ffa200fae81ae804b94747b4be2fdbfee4b5eef3e78c3740f5ba2b1"} Feb 16 21:57:15 crc kubenswrapper[4777]: I0216 21:57:15.427694 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"c2c4ae45-a8be-4fce-a939-53cf29c33a77","Type":"ContainerStarted","Data":"3cdec04c994d3516076ebe83c3a653788f8de1b433cf3a209572331d7ab970d3"} Feb 16 21:57:15 crc kubenswrapper[4777]: I0216 21:57:15.430539 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" event={"ID":"be9c8434-a8d2-4404-ad47-b9b91b21f439","Type":"ContainerStarted","Data":"0c249d84484fb10e04a1384027260228ffa08336d43fc2865f780b957fd6c45e"} Feb 16 21:57:16 crc kubenswrapper[4777]: I0216 21:57:16.193602 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06db55b-62f4-447a-9a65-10aebdf5957e" path="/var/lib/kubelet/pods/d06db55b-62f4-447a-9a65-10aebdf5957e/volumes" Feb 16 21:57:16 crc kubenswrapper[4777]: I0216 21:57:16.441880 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7","Type":"ContainerStarted","Data":"60182fd04e9d5476cc2a1c9c51c0af9f4232536d3c4f6f8cab5800e0ecb79a21"} Feb 16 21:57:16 crc kubenswrapper[4777]: I0216 21:57:16.443781 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9346e4ab-e1d7-42a9-8817-850a2f84e57d","Type":"ContainerStarted","Data":"7e00e91aac82460ebf3e6980e1ea18e4e936c85a9c532c3705eb436689217fbd"} Feb 16 21:57:16 crc kubenswrapper[4777]: I0216 21:57:16.446848 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0eaee64e-a445-4d25-9781-d7067c0841f8","Type":"ContainerStarted","Data":"99cb9b349c1b4a155720a37143ddf67bf7fdbc6c6432d14310d058fdd1954bd8"} Feb 16 21:57:16 
crc kubenswrapper[4777]: I0216 21:57:16.447083 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 16 21:57:16 crc kubenswrapper[4777]: I0216 21:57:16.457050 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4df600f3-97e1-4ac5-980b-2c42aecc5e81","Type":"ContainerStarted","Data":"7f88e90d4a34122af31a7965e83e3daa52c8942cc4703caf9b8d53ede467bef3"} Feb 16 21:57:16 crc kubenswrapper[4777]: I0216 21:57:16.463621 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"79b13da6-0857-40ec-9c96-5a5a28c6dd69","Type":"ContainerStarted","Data":"28a68a2ab515125d773b09355414fe2c0f365096096fe091e2bd8ea82faeb90e"} Feb 16 21:57:16 crc kubenswrapper[4777]: I0216 21:57:16.484663 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.321263765 podStartE2EDuration="23.484645779s" podCreationTimestamp="2026-02-16 21:56:53 +0000 UTC" firstStartedPulling="2026-02-16 21:57:03.578562737 +0000 UTC m=+1144.161063839" lastFinishedPulling="2026-02-16 21:57:14.741944751 +0000 UTC m=+1155.324445853" observedRunningTime="2026-02-16 21:57:16.480870504 +0000 UTC m=+1157.063371606" watchObservedRunningTime="2026-02-16 21:57:16.484645779 +0000 UTC m=+1157.067146871" Feb 16 21:57:17 crc kubenswrapper[4777]: I0216 21:57:17.469894 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e","Type":"ContainerStarted","Data":"3132e517c597825426765712331e8508d23ba170b6cf4a8adbe791e532abe261"} Feb 16 21:57:17 crc kubenswrapper[4777]: I0216 21:57:17.472550 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7wbfs" 
event={"ID":"129ef04f-890a-41e2-936b-833a227993e5","Type":"ContainerStarted","Data":"d98895eb745b2d85cf26c56248a0318a0a6baf162466e26cae064aeef6eec07d"} Feb 16 21:57:17 crc kubenswrapper[4777]: I0216 21:57:17.474350 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5qdr" event={"ID":"7f871748-de9b-43dc-87df-728b12402b6b","Type":"ContainerStarted","Data":"d3319ef1b2328f29cf22f52a8ed2952dfc672b2f6b8ce42862afe75b09479f68"} Feb 16 21:57:17 crc kubenswrapper[4777]: I0216 21:57:17.476237 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a","Type":"ContainerStarted","Data":"80721e328f2cfaf682ece60ef998bcdb69a64b13a282e49a9b490a8242423fa4"} Feb 16 21:57:18 crc kubenswrapper[4777]: I0216 21:57:18.486854 4777 generic.go:334] "Generic (PLEG): container finished" podID="129ef04f-890a-41e2-936b-833a227993e5" containerID="d98895eb745b2d85cf26c56248a0318a0a6baf162466e26cae064aeef6eec07d" exitCode=0 Feb 16 21:57:18 crc kubenswrapper[4777]: I0216 21:57:18.486920 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7wbfs" event={"ID":"129ef04f-890a-41e2-936b-833a227993e5","Type":"ContainerDied","Data":"d98895eb745b2d85cf26c56248a0318a0a6baf162466e26cae064aeef6eec07d"} Feb 16 21:57:18 crc kubenswrapper[4777]: I0216 21:57:18.489186 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a88f6b0-198b-4bb9-a593-607fae0b0a8d","Type":"ContainerStarted","Data":"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033"} Feb 16 21:57:18 crc kubenswrapper[4777]: I0216 21:57:18.522649 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-q5qdr" podStartSLOduration=11.719232731 podStartE2EDuration="21.522631787s" podCreationTimestamp="2026-02-16 21:56:57 +0000 UTC" firstStartedPulling="2026-02-16 21:57:03.913427785 +0000 
UTC m=+1144.495928887" lastFinishedPulling="2026-02-16 21:57:13.716826841 +0000 UTC m=+1154.299327943" observedRunningTime="2026-02-16 21:57:18.518663026 +0000 UTC m=+1159.101164138" watchObservedRunningTime="2026-02-16 21:57:18.522631787 +0000 UTC m=+1159.105132889" Feb 16 21:57:19 crc kubenswrapper[4777]: I0216 21:57:19.503759 4777 generic.go:334] "Generic (PLEG): container finished" podID="aadef7bb-2bff-4cd3-9a56-6b42ca417c94" containerID="3612329325ef8db7d4660e5364a7ff7b10180056ebebc09e823cd18bb581a20d" exitCode=0 Feb 16 21:57:19 crc kubenswrapper[4777]: I0216 21:57:19.504523 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"aadef7bb-2bff-4cd3-9a56-6b42ca417c94","Type":"ContainerDied","Data":"3612329325ef8db7d4660e5364a7ff7b10180056ebebc09e823cd18bb581a20d"} Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.825825 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-9zcrw"] Feb 16 21:57:20 crc kubenswrapper[4777]: E0216 21:57:20.829297 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06db55b-62f4-447a-9a65-10aebdf5957e" containerName="dnsmasq-dns" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.829317 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06db55b-62f4-447a-9a65-10aebdf5957e" containerName="dnsmasq-dns" Feb 16 21:57:20 crc kubenswrapper[4777]: E0216 21:57:20.829344 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06db55b-62f4-447a-9a65-10aebdf5957e" containerName="init" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.829350 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06db55b-62f4-447a-9a65-10aebdf5957e" containerName="init" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.831883 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06db55b-62f4-447a-9a65-10aebdf5957e" containerName="dnsmasq-dns" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 
21:57:20.832683 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.849779 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.859311 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9zcrw"] Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.878205 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/13df94da-db3f-470b-ab61-f8ad3a4dd75d-ovs-rundir\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.878245 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13df94da-db3f-470b-ab61-f8ad3a4dd75d-config\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.878289 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/13df94da-db3f-470b-ab61-f8ad3a4dd75d-ovn-rundir\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.878389 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df94da-db3f-470b-ab61-f8ad3a4dd75d-combined-ca-bundle\") pod \"ovn-controller-metrics-9zcrw\" (UID: 
\"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.878457 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13df94da-db3f-470b-ab61-f8ad3a4dd75d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.878515 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98fgh\" (UniqueName: \"kubernetes.io/projected/13df94da-db3f-470b-ab61-f8ad3a4dd75d-kube-api-access-98fgh\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.962390 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zbr9z"] Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.963776 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.966700 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.981270 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98fgh\" (UniqueName: \"kubernetes.io/projected/13df94da-db3f-470b-ab61-f8ad3a4dd75d-kube-api-access-98fgh\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.981512 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/13df94da-db3f-470b-ab61-f8ad3a4dd75d-ovs-rundir\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.981530 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13df94da-db3f-470b-ab61-f8ad3a4dd75d-config\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.981558 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/13df94da-db3f-470b-ab61-f8ad3a4dd75d-ovn-rundir\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.981627 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13df94da-db3f-470b-ab61-f8ad3a4dd75d-combined-ca-bundle\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.981666 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13df94da-db3f-470b-ab61-f8ad3a4dd75d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.982346 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/13df94da-db3f-470b-ab61-f8ad3a4dd75d-ovs-rundir\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.982396 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/13df94da-db3f-470b-ab61-f8ad3a4dd75d-ovn-rundir\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.982448 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13df94da-db3f-470b-ab61-f8ad3a4dd75d-config\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.984332 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zbr9z"] Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.986485 4777 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13df94da-db3f-470b-ab61-f8ad3a4dd75d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:20 crc kubenswrapper[4777]: I0216 21:57:20.988113 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df94da-db3f-470b-ab61-f8ad3a4dd75d-combined-ca-bundle\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.012886 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98fgh\" (UniqueName: \"kubernetes.io/projected/13df94da-db3f-470b-ab61-f8ad3a4dd75d-kube-api-access-98fgh\") pod \"ovn-controller-metrics-9zcrw\" (UID: \"13df94da-db3f-470b-ab61-f8ad3a4dd75d\") " pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.036728 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-9zcrw" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.090996 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.091190 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.091290 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-config\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.091355 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shq96\" (UniqueName: \"kubernetes.io/projected/dcab01c9-7bc6-4dad-9515-bc9807edfca0-kube-api-access-shq96\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.156868 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zbr9z"] Feb 16 21:57:21 crc kubenswrapper[4777]: E0216 21:57:21.157469 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config 
dns-svc kube-api-access-shq96 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" podUID="dcab01c9-7bc6-4dad-9515-bc9807edfca0" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.180093 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wsh5p"] Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.182212 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.187073 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.192600 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shq96\" (UniqueName: \"kubernetes.io/projected/dcab01c9-7bc6-4dad-9515-bc9807edfca0-kube-api-access-shq96\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.192651 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.192730 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.192780 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-config\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.193440 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-config\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.193441 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.193676 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.201654 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wsh5p"] Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.218195 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shq96\" (UniqueName: \"kubernetes.io/projected/dcab01c9-7bc6-4dad-9515-bc9807edfca0-kube-api-access-shq96\") pod \"dnsmasq-dns-7fd796d7df-zbr9z\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 
21:57:21.294742 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.295134 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5hfm\" (UniqueName: \"kubernetes.io/projected/e77213a6-5116-4eac-ad6d-fe007a7f53c8-kube-api-access-t5hfm\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.295194 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-config\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.295276 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.295381 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc 
kubenswrapper[4777]: I0216 21:57:21.397002 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.397074 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.397148 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.397169 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5hfm\" (UniqueName: \"kubernetes.io/projected/e77213a6-5116-4eac-ad6d-fe007a7f53c8-kube-api-access-t5hfm\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.397188 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-config\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.397969 4777 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-config\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.398456 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.399070 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.399356 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.417353 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5hfm\" (UniqueName: \"kubernetes.io/projected/e77213a6-5116-4eac-ad6d-fe007a7f53c8-kube-api-access-t5hfm\") pod \"dnsmasq-dns-86db49b7ff-wsh5p\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.485993 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 16 
21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.526958 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.557825 4777 generic.go:334] "Generic (PLEG): container finished" podID="f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a" containerID="80721e328f2cfaf682ece60ef998bcdb69a64b13a282e49a9b490a8242423fa4" exitCode=0 Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.557915 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a","Type":"ContainerDied","Data":"80721e328f2cfaf682ece60ef998bcdb69a64b13a282e49a9b490a8242423fa4"} Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.570174 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"a882b0c3-7f2e-446b-aea4-476cacffb112","Type":"ContainerStarted","Data":"761ac94184c9ac3b9edd12528f0b075c33d4523fb7a4e74bc4dab0e232c00807"} Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.570966 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.575859 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7wbfs" event={"ID":"129ef04f-890a-41e2-936b-833a227993e5","Type":"ContainerStarted","Data":"6ea413cb851b1a36a39f3e4659fb3de610a087e7baf088969c784ab92f76f1b4"} Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.576873 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" event={"ID":"dd2cb1a1-9d03-4791-bf4d-c93f79816e59","Type":"ContainerStarted","Data":"e23fbbf9addc73f946879e11da1b95eb135ffcc61dd31f2e7021d1ed554386da"} Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.577157 4777 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.578457 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"aadef7bb-2bff-4cd3-9a56-6b42ca417c94","Type":"ContainerStarted","Data":"cac54649d6821e575fe0095527587dab968ee3ba8d2e5e99f50920efbb86c2a8"} Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.580352 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.581239 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" event={"ID":"6bd47bb8-e12f-4cb4-a343-8f732e339484","Type":"ContainerStarted","Data":"052f6c16bd1fa87a82d63796d59a288552aa6293cf23c4f51e499c94a93b1bf2"} Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.581264 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.608995 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.629055 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb" podStartSLOduration=10.537810891 podStartE2EDuration="16.629041089s" podCreationTimestamp="2026-02-16 21:57:05 +0000 UTC" firstStartedPulling="2026-02-16 21:57:14.556067035 +0000 UTC m=+1155.138568177" lastFinishedPulling="2026-02-16 21:57:20.647297273 +0000 UTC m=+1161.229798375" observedRunningTime="2026-02-16 21:57:21.628197765 +0000 UTC m=+1162.210698867" watchObservedRunningTime="2026-02-16 21:57:21.629041089 +0000 UTC m=+1162.211542191" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.679791 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9zcrw"] Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.688548 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=10.544781526 podStartE2EDuration="16.688530185s" podCreationTimestamp="2026-02-16 21:57:05 +0000 UTC" firstStartedPulling="2026-02-16 21:57:14.580600732 +0000 UTC m=+1155.163101834" lastFinishedPulling="2026-02-16 21:57:20.724349391 +0000 UTC m=+1161.306850493" observedRunningTime="2026-02-16 21:57:21.666629721 +0000 UTC m=+1162.249130823" watchObservedRunningTime="2026-02-16 21:57:21.688530185 +0000 UTC m=+1162.271031287" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.701804 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.198042319 podStartE2EDuration="33.701788596s" podCreationTimestamp="2026-02-16 21:56:48 +0000 UTC" firstStartedPulling="2026-02-16 21:57:01.792354641 +0000 UTC m=+1142.374855753" lastFinishedPulling="2026-02-16 21:57:13.296100918 +0000 UTC m=+1153.878602030" observedRunningTime="2026-02-16 
21:57:21.70015262 +0000 UTC m=+1162.282653722" watchObservedRunningTime="2026-02-16 21:57:21.701788596 +0000 UTC m=+1162.284289698" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.707285 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-config\") pod \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.707340 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-ovsdbserver-nb\") pod \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.707422 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shq96\" (UniqueName: \"kubernetes.io/projected/dcab01c9-7bc6-4dad-9515-bc9807edfca0-kube-api-access-shq96\") pod \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.707532 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-dns-svc\") pod \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\" (UID: \"dcab01c9-7bc6-4dad-9515-bc9807edfca0\") " Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.709635 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-config" (OuterVolumeSpecName: "config") pod "dcab01c9-7bc6-4dad-9515-bc9807edfca0" (UID: "dcab01c9-7bc6-4dad-9515-bc9807edfca0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.712914 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dcab01c9-7bc6-4dad-9515-bc9807edfca0" (UID: "dcab01c9-7bc6-4dad-9515-bc9807edfca0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.713787 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dcab01c9-7bc6-4dad-9515-bc9807edfca0" (UID: "dcab01c9-7bc6-4dad-9515-bc9807edfca0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.809446 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.809785 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.809798 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcab01c9-7bc6-4dad-9515-bc9807edfca0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.844467 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcab01c9-7bc6-4dad-9515-bc9807edfca0-kube-api-access-shq96" (OuterVolumeSpecName: "kube-api-access-shq96") pod 
"dcab01c9-7bc6-4dad-9515-bc9807edfca0" (UID: "dcab01c9-7bc6-4dad-9515-bc9807edfca0"). InnerVolumeSpecName "kube-api-access-shq96". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:21 crc kubenswrapper[4777]: W0216 21:57:21.871401 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13df94da_db3f_470b_ab61_f8ad3a4dd75d.slice/crio-8ee8e2cbf6347d2b0400bc044b2c89f6555c1ba1983d99fda8fb7ca861a5f37e WatchSource:0}: Error finding container 8ee8e2cbf6347d2b0400bc044b2c89f6555c1ba1983d99fda8fb7ca861a5f37e: Status 404 returned error can't find the container with id 8ee8e2cbf6347d2b0400bc044b2c89f6555c1ba1983d99fda8fb7ca861a5f37e Feb 16 21:57:21 crc kubenswrapper[4777]: I0216 21:57:21.912699 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shq96\" (UniqueName: \"kubernetes.io/projected/dcab01c9-7bc6-4dad-9515-bc9807edfca0-kube-api-access-shq96\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.246724 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7" podStartSLOduration=12.071373793 podStartE2EDuration="18.246690597s" podCreationTimestamp="2026-02-16 21:57:04 +0000 UTC" firstStartedPulling="2026-02-16 21:57:14.548474142 +0000 UTC m=+1155.130975244" lastFinishedPulling="2026-02-16 21:57:20.723790916 +0000 UTC m=+1161.306292048" observedRunningTime="2026-02-16 21:57:21.737249789 +0000 UTC m=+1162.319750891" watchObservedRunningTime="2026-02-16 21:57:22.246690597 +0000 UTC m=+1162.829191699" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.251644 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wsh5p"] Feb 16 21:57:22 crc kubenswrapper[4777]: W0216 21:57:22.311536 4777 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode77213a6_5116_4eac_ad6d_fe007a7f53c8.slice/crio-20ace032eb33d1b4cb79c2ac1dfd2811cd2d49386b21221f1c1b6b909a6bccba WatchSource:0}: Error finding container 20ace032eb33d1b4cb79c2ac1dfd2811cd2d49386b21221f1c1b6b909a6bccba: Status 404 returned error can't find the container with id 20ace032eb33d1b4cb79c2ac1dfd2811cd2d49386b21221f1c1b6b909a6bccba Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.590658 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" event={"ID":"a24ca457-99d2-4651-9220-cfca7e58df3e","Type":"ContainerStarted","Data":"6f003f5f0c8e4a81c101548f2117d9ef299dd4522cf8a3f223d903d2bbd89324"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.591251 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.594545 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7wbfs" event={"ID":"129ef04f-890a-41e2-936b-833a227993e5","Type":"ContainerStarted","Data":"04967facbb4832e28fe24fe8089bd44f36d4a850fbd72501d87f9d75576ca69e"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.594623 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.594747 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.596527 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7","Type":"ContainerStarted","Data":"14d540ec67170d2ec7952f942f422f48ce4734b1bf5a6d075bf7bc8cfe4039a9"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.598053 4777 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" event={"ID":"1d115cc4-a668-4cd2-b4ae-f812341416a6","Type":"ContainerStarted","Data":"d7577e59e0d7d6e300948bb7d494565f3c366716dfa308ec920624d0a6314684"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.598663 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.599599 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" event={"ID":"be9c8434-a8d2-4404-ad47-b9b91b21f439","Type":"ContainerStarted","Data":"9b65cc63b9319c331238e3e6fb73faac6cd81a32aaba922cb8491cf95cb53521"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.599997 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.603491 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a","Type":"ContainerStarted","Data":"287e460c68b8fe65b08602db49d45121d445a80ac141ca658ef63eb34be802ad"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.611822 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"ae4b796b-cd3f-4a97-a43f-9fec28e71ac7","Type":"ContainerStarted","Data":"cfccfc3199c924a8e3b1d2d36688119fe235d4a83c0dd26736cc591e9c85715d"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.612474 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.615250 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" 
podStartSLOduration=11.356371486 podStartE2EDuration="17.615236439s" podCreationTimestamp="2026-02-16 21:57:05 +0000 UTC" firstStartedPulling="2026-02-16 21:57:14.628104153 +0000 UTC m=+1155.210605295" lastFinishedPulling="2026-02-16 21:57:20.886969146 +0000 UTC m=+1161.469470248" observedRunningTime="2026-02-16 21:57:22.607251806 +0000 UTC m=+1163.189752908" watchObservedRunningTime="2026-02-16 21:57:22.615236439 +0000 UTC m=+1163.197737541" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.615403 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"79b13da6-0857-40ec-9c96-5a5a28c6dd69","Type":"ContainerStarted","Data":"17cb1b72535eae51f1c189b0ae62ad75efd9469a8a14ec99f2e2769de171da87"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.619058 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9zcrw" event={"ID":"13df94da-db3f-470b-ab61-f8ad3a4dd75d","Type":"ContainerStarted","Data":"3f3e54436d8c521fee362b37e62710109516d6d9bfb1848fd6cf52d3f74efeb5"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.619113 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9zcrw" event={"ID":"13df94da-db3f-470b-ab61-f8ad3a4dd75d","Type":"ContainerStarted","Data":"8ee8e2cbf6347d2b0400bc044b2c89f6555c1ba1983d99fda8fb7ca861a5f37e"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.620666 4777 generic.go:334] "Generic (PLEG): container finished" podID="e77213a6-5116-4eac-ad6d-fe007a7f53c8" containerID="2fc2883fbef7fd45213091e33c5a956ce28c1bc8b1f74c9e805dac88b73a6150" exitCode=0 Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.620753 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" event={"ID":"e77213a6-5116-4eac-ad6d-fe007a7f53c8","Type":"ContainerDied","Data":"2fc2883fbef7fd45213091e33c5a956ce28c1bc8b1f74c9e805dac88b73a6150"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 
21:57:22.620779 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" event={"ID":"e77213a6-5116-4eac-ad6d-fe007a7f53c8","Type":"ContainerStarted","Data":"20ace032eb33d1b4cb79c2ac1dfd2811cd2d49386b21221f1c1b6b909a6bccba"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.624008 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-zbr9z" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.625043 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"c2c4ae45-a8be-4fce-a939-53cf29c33a77","Type":"ContainerStarted","Data":"3c6c55eae83663ed6b5e0f3dab802bc5afde91bf88a01cc322de359470b478f7"} Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.625839 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.629002 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8" podStartSLOduration=11.490986127 podStartE2EDuration="17.628991785s" podCreationTimestamp="2026-02-16 21:57:05 +0000 UTC" firstStartedPulling="2026-02-16 21:57:14.592804044 +0000 UTC m=+1155.175305146" lastFinishedPulling="2026-02-16 21:57:20.730809662 +0000 UTC m=+1161.313310804" observedRunningTime="2026-02-16 21:57:22.627732539 +0000 UTC m=+1163.210233641" watchObservedRunningTime="2026-02-16 21:57:22.628991785 +0000 UTC m=+1163.211492887" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.639404 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.639745 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-vsxfm" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.655990 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.785714653 podStartE2EDuration="23.65597265s" podCreationTimestamp="2026-02-16 21:56:59 +0000 UTC" firstStartedPulling="2026-02-16 21:57:04.073484068 +0000 UTC m=+1144.655985170" lastFinishedPulling="2026-02-16 21:57:20.943742065 +0000 UTC m=+1161.526243167" observedRunningTime="2026-02-16 21:57:22.653287205 +0000 UTC m=+1163.235788307" watchObservedRunningTime="2026-02-16 21:57:22.65597265 +0000 UTC m=+1163.238473752" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.688533 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-8hh82" podStartSLOduration=11.615092213 podStartE2EDuration="17.688518402s" podCreationTimestamp="2026-02-16 21:57:05 +0000 UTC" firstStartedPulling="2026-02-16 21:57:14.627932168 +0000 UTC m=+1155.210433270" lastFinishedPulling="2026-02-16 21:57:20.701358317 +0000 UTC m=+1161.283859459" observedRunningTime="2026-02-16 21:57:22.676427853 +0000 UTC m=+1163.258928985" watchObservedRunningTime="2026-02-16 21:57:22.688518402 +0000 UTC m=+1163.271019494" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.704693 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-7wbfs" podStartSLOduration=17.790428688 podStartE2EDuration="25.704671944s" podCreationTimestamp="2026-02-16 21:56:57 +0000 UTC" firstStartedPulling="2026-02-16 21:57:05.521679898 +0000 UTC m=+1146.104181000" lastFinishedPulling="2026-02-16 21:57:13.435923154 +0000 UTC m=+1154.018424256" observedRunningTime="2026-02-16 21:57:22.70452714 +0000 UTC m=+1163.287028242" watchObservedRunningTime="2026-02-16 21:57:22.704671944 +0000 UTC m=+1163.287173046" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 
21:57:22.740110 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.607162971 podStartE2EDuration="32.740083996s" podCreationTimestamp="2026-02-16 21:56:50 +0000 UTC" firstStartedPulling="2026-02-16 21:57:03.549128002 +0000 UTC m=+1144.131629104" lastFinishedPulling="2026-02-16 21:57:13.682049027 +0000 UTC m=+1154.264550129" observedRunningTime="2026-02-16 21:57:22.739067938 +0000 UTC m=+1163.321569050" watchObservedRunningTime="2026-02-16 21:57:22.740083996 +0000 UTC m=+1163.322585098" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.822769 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=11.710774822 podStartE2EDuration="17.822748281s" podCreationTimestamp="2026-02-16 21:57:05 +0000 UTC" firstStartedPulling="2026-02-16 21:57:14.628156244 +0000 UTC m=+1155.210657386" lastFinishedPulling="2026-02-16 21:57:20.740129733 +0000 UTC m=+1161.322630845" observedRunningTime="2026-02-16 21:57:22.813860822 +0000 UTC m=+1163.396361924" watchObservedRunningTime="2026-02-16 21:57:22.822748281 +0000 UTC m=+1163.405249383" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.833042 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-9zcrw" podStartSLOduration=2.833020819 podStartE2EDuration="2.833020819s" podCreationTimestamp="2026-02-16 21:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:57:22.82948338 +0000 UTC m=+1163.411984482" watchObservedRunningTime="2026-02-16 21:57:22.833020819 +0000 UTC m=+1163.415521921" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.867191 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=11.7542928 
podStartE2EDuration="17.867171155s" podCreationTimestamp="2026-02-16 21:57:05 +0000 UTC" firstStartedPulling="2026-02-16 21:57:14.627952428 +0000 UTC m=+1155.210453530" lastFinishedPulling="2026-02-16 21:57:20.740830743 +0000 UTC m=+1161.323331885" observedRunningTime="2026-02-16 21:57:22.848700008 +0000 UTC m=+1163.431201110" watchObservedRunningTime="2026-02-16 21:57:22.867171155 +0000 UTC m=+1163.449672257" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.881298 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-q5qdr" Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.887025 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zbr9z"] Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.898816 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-zbr9z"] Feb 16 21:57:22 crc kubenswrapper[4777]: I0216 21:57:22.905361 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.732882197 podStartE2EDuration="22.905344884s" podCreationTimestamp="2026-02-16 21:57:00 +0000 UTC" firstStartedPulling="2026-02-16 21:57:05.532867022 +0000 UTC m=+1146.115368124" lastFinishedPulling="2026-02-16 21:57:20.705329669 +0000 UTC m=+1161.287830811" observedRunningTime="2026-02-16 21:57:22.898588785 +0000 UTC m=+1163.481089887" watchObservedRunningTime="2026-02-16 21:57:22.905344884 +0000 UTC m=+1163.487845976" Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.634738 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" event={"ID":"e77213a6-5116-4eac-ad6d-fe007a7f53c8","Type":"ContainerStarted","Data":"77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714"} Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.635262 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.636699 4777 generic.go:334] "Generic (PLEG): container finished" podID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerID="5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033" exitCode=0 Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.636754 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a88f6b0-198b-4bb9-a593-607fae0b0a8d","Type":"ContainerDied","Data":"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033"} Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.639042 4777 generic.go:334] "Generic (PLEG): container finished" podID="29233f83-ffa4-4cbf-bc7e-435e4b44cd5e" containerID="3132e517c597825426765712331e8508d23ba170b6cf4a8adbe791e532abe261" exitCode=0 Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.639159 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e","Type":"ContainerDied","Data":"3132e517c597825426765712331e8508d23ba170b6cf4a8adbe791e532abe261"} Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.695163 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" podStartSLOduration=2.695142635 podStartE2EDuration="2.695142635s" podCreationTimestamp="2026-02-16 21:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:57:23.678665233 +0000 UTC m=+1164.261166345" watchObservedRunningTime="2026-02-16 21:57:23.695142635 +0000 UTC m=+1164.277643737" Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.704757 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.786366 4777 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wsh5p"] Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.840980 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx9j2"] Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.845267 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.850806 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx9j2"] Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.980382 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.980427 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-dns-svc\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.980688 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.980785 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-config\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:23 crc kubenswrapper[4777]: I0216 21:57:23.980874 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdg2p\" (UniqueName: \"kubernetes.io/projected/9043c023-eae7-4cba-bcfe-3e44fa168c38-kube-api-access-pdg2p\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.082737 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.082783 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-config\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.082804 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdg2p\" (UniqueName: \"kubernetes.io/projected/9043c023-eae7-4cba-bcfe-3e44fa168c38-kube-api-access-pdg2p\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.082878 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.082897 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-dns-svc\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.083865 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.084011 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-dns-svc\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.084378 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-config\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.084513 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: 
\"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.099920 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdg2p\" (UniqueName: \"kubernetes.io/projected/9043c023-eae7-4cba-bcfe-3e44fa168c38-kube-api-access-pdg2p\") pod \"dnsmasq-dns-698758b865-rx9j2\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.169341 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.190412 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcab01c9-7bc6-4dad-9515-bc9807edfca0" path="/var/lib/kubelet/pods/dcab01c9-7bc6-4dad-9515-bc9807edfca0/volumes" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.616990 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx9j2"] Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.650803 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rx9j2" event={"ID":"9043c023-eae7-4cba-bcfe-3e44fa168c38","Type":"ContainerStarted","Data":"69bc2cd7615cd7d33ad59e46294e3cb0e47554eabfdf20c4f1103ec26f163321"} Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.910053 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.918663 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.920231 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.920648 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.920969 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.921382 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jn7jz" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.927802 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.997487 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3392d073-5de5-4f7e-ae87-e892f769157a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.997566 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl428\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-kube-api-access-sl428\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.997674 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0e1c26f7-4289-48eb-9f02-9eb7714b37ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e1c26f7-4289-48eb-9f02-9eb7714b37ec\") pod \"swift-storage-0\" 
(UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.997793 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3392d073-5de5-4f7e-ae87-e892f769157a-cache\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.997814 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:24 crc kubenswrapper[4777]: I0216 21:57:24.997992 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3392d073-5de5-4f7e-ae87-e892f769157a-lock\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.106984 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3392d073-5de5-4f7e-ae87-e892f769157a-lock\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.107081 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3392d073-5de5-4f7e-ae87-e892f769157a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.107176 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sl428\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-kube-api-access-sl428\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.107216 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0e1c26f7-4289-48eb-9f02-9eb7714b37ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e1c26f7-4289-48eb-9f02-9eb7714b37ec\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.107279 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3392d073-5de5-4f7e-ae87-e892f769157a-cache\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.107298 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: E0216 21:57:25.107544 4777 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 16 21:57:25 crc kubenswrapper[4777]: E0216 21:57:25.107557 4777 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 16 21:57:25 crc kubenswrapper[4777]: E0216 21:57:25.107600 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift 
podName:3392d073-5de5-4f7e-ae87-e892f769157a nodeName:}" failed. No retries permitted until 2026-02-16 21:57:25.607584402 +0000 UTC m=+1166.190085504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift") pod "swift-storage-0" (UID: "3392d073-5de5-4f7e-ae87-e892f769157a") : configmap "swift-ring-files" not found Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.107871 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3392d073-5de5-4f7e-ae87-e892f769157a-cache\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.108977 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3392d073-5de5-4f7e-ae87-e892f769157a-lock\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.115162 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.115207 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0e1c26f7-4289-48eb-9f02-9eb7714b37ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e1c26f7-4289-48eb-9f02-9eb7714b37ec\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0f69479f887f81f7d25a2d029a7539d096a846f679710148147795421537c1a4/globalmount\"" pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.115619 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3392d073-5de5-4f7e-ae87-e892f769157a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.134493 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl428\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-kube-api-access-sl428\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.147869 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0e1c26f7-4289-48eb-9f02-9eb7714b37ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0e1c26f7-4289-48eb-9f02-9eb7714b37ec\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.338081 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.389420 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.540303 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.591915 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.614924 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:25 crc kubenswrapper[4777]: E0216 21:57:25.616250 4777 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 16 21:57:25 crc kubenswrapper[4777]: E0216 21:57:25.616357 4777 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 16 21:57:25 crc kubenswrapper[4777]: E0216 21:57:25.616460 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift podName:3392d073-5de5-4f7e-ae87-e892f769157a nodeName:}" failed. No retries permitted until 2026-02-16 21:57:26.616439474 +0000 UTC m=+1167.198940666 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift") pod "swift-storage-0" (UID: "3392d073-5de5-4f7e-ae87-e892f769157a") : configmap "swift-ring-files" not found Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.661589 4777 generic.go:334] "Generic (PLEG): container finished" podID="9043c023-eae7-4cba-bcfe-3e44fa168c38" containerID="59c4a9175f1127800a7e66bfb7efbac3333819929c7c5f613beb21b8b3d5c524" exitCode=0 Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.661760 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rx9j2" event={"ID":"9043c023-eae7-4cba-bcfe-3e44fa168c38","Type":"ContainerDied","Data":"59c4a9175f1127800a7e66bfb7efbac3333819929c7c5f613beb21b8b3d5c524"} Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.662514 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.662543 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.662657 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" podUID="e77213a6-5116-4eac-ad6d-fe007a7f53c8" containerName="dnsmasq-dns" containerID="cri-o://77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714" gracePeriod=10 Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.739843 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 16 21:57:25 crc kubenswrapper[4777]: I0216 21:57:25.739937 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.149457 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 16 
21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.151173 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.154971 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.155222 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.155463 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.156305 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qgcvn" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.169243 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.276343 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/087976e9-29d9-4bc1-9939-e73ef0be9e0e-scripts\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.276391 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/087976e9-29d9-4bc1-9939-e73ef0be9e0e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.276416 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087976e9-29d9-4bc1-9939-e73ef0be9e0e-combined-ca-bundle\") pod 
\"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.276497 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpbbk\" (UniqueName: \"kubernetes.io/projected/087976e9-29d9-4bc1-9939-e73ef0be9e0e-kube-api-access-qpbbk\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.276842 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/087976e9-29d9-4bc1-9939-e73ef0be9e0e-config\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.279289 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/087976e9-29d9-4bc1-9939-e73ef0be9e0e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.279663 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/087976e9-29d9-4bc1-9939-e73ef0be9e0e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.361699 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.387938 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/087976e9-29d9-4bc1-9939-e73ef0be9e0e-config\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.388054 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/087976e9-29d9-4bc1-9939-e73ef0be9e0e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.388093 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/087976e9-29d9-4bc1-9939-e73ef0be9e0e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.388155 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/087976e9-29d9-4bc1-9939-e73ef0be9e0e-scripts\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.388173 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/087976e9-29d9-4bc1-9939-e73ef0be9e0e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.388191 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087976e9-29d9-4bc1-9939-e73ef0be9e0e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.388215 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpbbk\" (UniqueName: \"kubernetes.io/projected/087976e9-29d9-4bc1-9939-e73ef0be9e0e-kube-api-access-qpbbk\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.388960 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/087976e9-29d9-4bc1-9939-e73ef0be9e0e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.389233 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/087976e9-29d9-4bc1-9939-e73ef0be9e0e-config\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.389249 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/087976e9-29d9-4bc1-9939-e73ef0be9e0e-scripts\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.394471 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/087976e9-29d9-4bc1-9939-e73ef0be9e0e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc 
kubenswrapper[4777]: I0216 21:57:26.395210 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087976e9-29d9-4bc1-9939-e73ef0be9e0e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.405675 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpbbk\" (UniqueName: \"kubernetes.io/projected/087976e9-29d9-4bc1-9939-e73ef0be9e0e-kube-api-access-qpbbk\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.415501 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/087976e9-29d9-4bc1-9939-e73ef0be9e0e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"087976e9-29d9-4bc1-9939-e73ef0be9e0e\") " pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.489661 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-config\") pod \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.491483 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5hfm\" (UniqueName: \"kubernetes.io/projected/e77213a6-5116-4eac-ad6d-fe007a7f53c8-kube-api-access-t5hfm\") pod \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.492904 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-nb\") pod \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.492940 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-dns-svc\") pod \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.493013 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-sb\") pod \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\" (UID: \"e77213a6-5116-4eac-ad6d-fe007a7f53c8\") " Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.493042 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.496194 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e77213a6-5116-4eac-ad6d-fe007a7f53c8-kube-api-access-t5hfm" (OuterVolumeSpecName: "kube-api-access-t5hfm") pod "e77213a6-5116-4eac-ad6d-fe007a7f53c8" (UID: "e77213a6-5116-4eac-ad6d-fe007a7f53c8"). InnerVolumeSpecName "kube-api-access-t5hfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.530269 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-config" (OuterVolumeSpecName: "config") pod "e77213a6-5116-4eac-ad6d-fe007a7f53c8" (UID: "e77213a6-5116-4eac-ad6d-fe007a7f53c8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.536658 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e77213a6-5116-4eac-ad6d-fe007a7f53c8" (UID: "e77213a6-5116-4eac-ad6d-fe007a7f53c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.544063 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e77213a6-5116-4eac-ad6d-fe007a7f53c8" (UID: "e77213a6-5116-4eac-ad6d-fe007a7f53c8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.552624 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e77213a6-5116-4eac-ad6d-fe007a7f53c8" (UID: "e77213a6-5116-4eac-ad6d-fe007a7f53c8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.594984 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.595009 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5hfm\" (UniqueName: \"kubernetes.io/projected/e77213a6-5116-4eac-ad6d-fe007a7f53c8-kube-api-access-t5hfm\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.595020 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.595030 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.595038 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e77213a6-5116-4eac-ad6d-fe007a7f53c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.683548 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rx9j2" event={"ID":"9043c023-eae7-4cba-bcfe-3e44fa168c38","Type":"ContainerStarted","Data":"e0381be60a85348ea300843122315b258d68348370597d9404770c1c589231b2"} Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.684321 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.687952 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.688050 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" event={"ID":"e77213a6-5116-4eac-ad6d-fe007a7f53c8","Type":"ContainerDied","Data":"77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714"} Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.688082 4777 scope.go:117] "RemoveContainer" containerID="77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.687859 4777 generic.go:334] "Generic (PLEG): container finished" podID="e77213a6-5116-4eac-ad6d-fe007a7f53c8" containerID="77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714" exitCode=0 Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.691707 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wsh5p" event={"ID":"e77213a6-5116-4eac-ad6d-fe007a7f53c8","Type":"ContainerDied","Data":"20ace032eb33d1b4cb79c2ac1dfd2811cd2d49386b21221f1c1b6b909a6bccba"} Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.696192 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:26 crc kubenswrapper[4777]: E0216 21:57:26.696589 4777 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 16 21:57:26 crc kubenswrapper[4777]: E0216 21:57:26.696668 4777 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 16 21:57:26 crc kubenswrapper[4777]: E0216 21:57:26.696778 4777 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift podName:3392d073-5de5-4f7e-ae87-e892f769157a nodeName:}" failed. No retries permitted until 2026-02-16 21:57:28.696763121 +0000 UTC m=+1169.279264223 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift") pod "swift-storage-0" (UID: "3392d073-5de5-4f7e-ae87-e892f769157a") : configmap "swift-ring-files" not found Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.709098 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-rx9j2" podStartSLOduration=3.7090595459999998 podStartE2EDuration="3.709059546s" podCreationTimestamp="2026-02-16 21:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:57:26.704121007 +0000 UTC m=+1167.286622109" watchObservedRunningTime="2026-02-16 21:57:26.709059546 +0000 UTC m=+1167.291560648" Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.731171 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wsh5p"] Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.737325 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wsh5p"] Feb 16 21:57:26 crc kubenswrapper[4777]: I0216 21:57:26.947905 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 16 21:57:27 crc kubenswrapper[4777]: W0216 21:57:27.339510 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod087976e9_29d9_4bc1_9939_e73ef0be9e0e.slice/crio-8cd3aaa23760dbb1085839c3a7936e1f51fd2f91998b6fc45b36ecf32c0c44c4 WatchSource:0}: Error finding container 8cd3aaa23760dbb1085839c3a7936e1f51fd2f91998b6fc45b36ecf32c0c44c4: Status 404 
returned error can't find the container with id 8cd3aaa23760dbb1085839c3a7936e1f51fd2f91998b6fc45b36ecf32c0c44c4 Feb 16 21:57:27 crc kubenswrapper[4777]: I0216 21:57:27.346988 4777 scope.go:117] "RemoveContainer" containerID="2fc2883fbef7fd45213091e33c5a956ce28c1bc8b1f74c9e805dac88b73a6150" Feb 16 21:57:27 crc kubenswrapper[4777]: I0216 21:57:27.396278 4777 scope.go:117] "RemoveContainer" containerID="77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714" Feb 16 21:57:27 crc kubenswrapper[4777]: E0216 21:57:27.396743 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714\": container with ID starting with 77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714 not found: ID does not exist" containerID="77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714" Feb 16 21:57:27 crc kubenswrapper[4777]: I0216 21:57:27.396781 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714"} err="failed to get container status \"77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714\": rpc error: code = NotFound desc = could not find container \"77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714\": container with ID starting with 77626b14338919581feef0ff5feb810ba67d1e7949419f4bbc52ff729c3e2714 not found: ID does not exist" Feb 16 21:57:27 crc kubenswrapper[4777]: I0216 21:57:27.396808 4777 scope.go:117] "RemoveContainer" containerID="2fc2883fbef7fd45213091e33c5a956ce28c1bc8b1f74c9e805dac88b73a6150" Feb 16 21:57:27 crc kubenswrapper[4777]: E0216 21:57:27.397322 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc2883fbef7fd45213091e33c5a956ce28c1bc8b1f74c9e805dac88b73a6150\": container with ID starting with 
2fc2883fbef7fd45213091e33c5a956ce28c1bc8b1f74c9e805dac88b73a6150 not found: ID does not exist" containerID="2fc2883fbef7fd45213091e33c5a956ce28c1bc8b1f74c9e805dac88b73a6150" Feb 16 21:57:27 crc kubenswrapper[4777]: I0216 21:57:27.397395 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc2883fbef7fd45213091e33c5a956ce28c1bc8b1f74c9e805dac88b73a6150"} err="failed to get container status \"2fc2883fbef7fd45213091e33c5a956ce28c1bc8b1f74c9e805dac88b73a6150\": rpc error: code = NotFound desc = could not find container \"2fc2883fbef7fd45213091e33c5a956ce28c1bc8b1f74c9e805dac88b73a6150\": container with ID starting with 2fc2883fbef7fd45213091e33c5a956ce28c1bc8b1f74c9e805dac88b73a6150 not found: ID does not exist" Feb 16 21:57:27 crc kubenswrapper[4777]: I0216 21:57:27.700832 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e","Type":"ContainerStarted","Data":"0e0222372c9d17ef246b938b199981fb7553a099f8841246937107ecd10e461a"} Feb 16 21:57:27 crc kubenswrapper[4777]: I0216 21:57:27.702281 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"087976e9-29d9-4bc1-9939-e73ef0be9e0e","Type":"ContainerStarted","Data":"8cd3aaa23760dbb1085839c3a7936e1f51fd2f91998b6fc45b36ecf32c0c44c4"} Feb 16 21:57:28 crc kubenswrapper[4777]: I0216 21:57:28.192100 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e77213a6-5116-4eac-ad6d-fe007a7f53c8" path="/var/lib/kubelet/pods/e77213a6-5116-4eac-ad6d-fe007a7f53c8/volumes" Feb 16 21:57:28 crc kubenswrapper[4777]: I0216 21:57:28.732980 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 
21:57:28 crc kubenswrapper[4777]: E0216 21:57:28.733312 4777 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 16 21:57:28 crc kubenswrapper[4777]: E0216 21:57:28.733356 4777 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 16 21:57:28 crc kubenswrapper[4777]: E0216 21:57:28.733451 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift podName:3392d073-5de5-4f7e-ae87-e892f769157a nodeName:}" failed. No retries permitted until 2026-02-16 21:57:32.733421392 +0000 UTC m=+1173.315922524 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift") pod "swift-storage-0" (UID: "3392d073-5de5-4f7e-ae87-e892f769157a") : configmap "swift-ring-files" not found Feb 16 21:57:28 crc kubenswrapper[4777]: I0216 21:57:28.894451 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rjx9d"] Feb 16 21:57:28 crc kubenswrapper[4777]: E0216 21:57:28.896054 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e77213a6-5116-4eac-ad6d-fe007a7f53c8" containerName="init" Feb 16 21:57:28 crc kubenswrapper[4777]: I0216 21:57:28.896092 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e77213a6-5116-4eac-ad6d-fe007a7f53c8" containerName="init" Feb 16 21:57:28 crc kubenswrapper[4777]: E0216 21:57:28.896158 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e77213a6-5116-4eac-ad6d-fe007a7f53c8" containerName="dnsmasq-dns" Feb 16 21:57:28 crc kubenswrapper[4777]: I0216 21:57:28.896172 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e77213a6-5116-4eac-ad6d-fe007a7f53c8" containerName="dnsmasq-dns" Feb 16 21:57:28 crc kubenswrapper[4777]: I0216 21:57:28.896532 4777 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e77213a6-5116-4eac-ad6d-fe007a7f53c8" containerName="dnsmasq-dns" Feb 16 21:57:28 crc kubenswrapper[4777]: I0216 21:57:28.897629 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:28 crc kubenswrapper[4777]: I0216 21:57:28.901198 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 16 21:57:28 crc kubenswrapper[4777]: I0216 21:57:28.901245 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 16 21:57:28 crc kubenswrapper[4777]: I0216 21:57:28.901251 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 16 21:57:28 crc kubenswrapper[4777]: I0216 21:57:28.913076 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rjx9d"] Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.037346 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-swiftconf\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.037387 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f855cb15-8085-43b1-825e-e6316c580924-etc-swift\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.037459 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-ring-data-devices\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.037731 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn89z\" (UniqueName: \"kubernetes.io/projected/f855cb15-8085-43b1-825e-e6316c580924-kube-api-access-rn89z\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.037801 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-combined-ca-bundle\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.037853 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-scripts\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.038093 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-dispersionconf\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.140804 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-dispersionconf\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.140892 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-swiftconf\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.140912 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f855cb15-8085-43b1-825e-e6316c580924-etc-swift\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.140980 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-ring-data-devices\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.141284 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn89z\" (UniqueName: \"kubernetes.io/projected/f855cb15-8085-43b1-825e-e6316c580924-kube-api-access-rn89z\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.141326 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-combined-ca-bundle\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.141366 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-scripts\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.141730 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f855cb15-8085-43b1-825e-e6316c580924-etc-swift\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.142312 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-ring-data-devices\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.143261 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-scripts\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.148617 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-combined-ca-bundle\") pod \"swift-ring-rebalance-rjx9d\" (UID: 
\"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.148803 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-dispersionconf\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.151418 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-swiftconf\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.166185 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn89z\" (UniqueName: \"kubernetes.io/projected/f855cb15-8085-43b1-825e-e6316c580924-kube-api-access-rn89z\") pod \"swift-ring-rebalance-rjx9d\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") " pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.233393 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rjx9d" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.756470 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rjx9d"] Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.855013 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.855280 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 16 21:57:29 crc kubenswrapper[4777]: I0216 21:57:29.961571 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 16 21:57:30 crc kubenswrapper[4777]: I0216 21:57:30.747022 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"29233f83-ffa4-4cbf-bc7e-435e4b44cd5e","Type":"ContainerStarted","Data":"79bd2a04488bce1d96f1df4216f9f98363aa08307b99b2a3a4edac227ab7bd08"} Feb 16 21:57:30 crc kubenswrapper[4777]: I0216 21:57:30.748330 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 16 21:57:30 crc kubenswrapper[4777]: I0216 21:57:30.750315 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 16 21:57:30 crc kubenswrapper[4777]: I0216 21:57:30.777990 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=13.29556225 podStartE2EDuration="36.777948384s" podCreationTimestamp="2026-02-16 21:56:54 +0000 UTC" firstStartedPulling="2026-02-16 21:57:03.929075194 +0000 UTC m=+1144.511576296" lastFinishedPulling="2026-02-16 21:57:27.411461328 +0000 UTC m=+1167.993962430" observedRunningTime="2026-02-16 21:57:30.776170074 +0000 UTC m=+1171.358671226" watchObservedRunningTime="2026-02-16 
21:57:30.777948384 +0000 UTC m=+1171.360449486" Feb 16 21:57:30 crc kubenswrapper[4777]: I0216 21:57:30.889172 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.420239 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.420280 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.505093 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.805045 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2f2h7"] Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.806493 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2f2h7" Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.817227 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2f2h7"] Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.862300 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3613-account-create-update-g4zwl"] Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.863581 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3613-account-create-update-g4zwl" Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.865908 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.878208 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3613-account-create-update-g4zwl"] Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.879665 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.898146 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z54pl\" (UniqueName: \"kubernetes.io/projected/473b7fa7-2de5-4e0a-921e-80880017c429-kube-api-access-z54pl\") pod \"glance-3613-account-create-update-g4zwl\" (UID: \"473b7fa7-2de5-4e0a-921e-80880017c429\") " pod="openstack/glance-3613-account-create-update-g4zwl" Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.898278 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-operator-scripts\") pod \"glance-db-create-2f2h7\" (UID: \"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a\") " pod="openstack/glance-db-create-2f2h7" Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.898310 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/473b7fa7-2de5-4e0a-921e-80880017c429-operator-scripts\") pod \"glance-3613-account-create-update-g4zwl\" (UID: \"473b7fa7-2de5-4e0a-921e-80880017c429\") " pod="openstack/glance-3613-account-create-update-g4zwl" Feb 16 21:57:31 crc kubenswrapper[4777]: I0216 21:57:31.899796 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9pq2\" (UniqueName: \"kubernetes.io/projected/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-kube-api-access-m9pq2\") pod \"glance-db-create-2f2h7\" (UID: \"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a\") " pod="openstack/glance-db-create-2f2h7" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.001815 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z54pl\" (UniqueName: \"kubernetes.io/projected/473b7fa7-2de5-4e0a-921e-80880017c429-kube-api-access-z54pl\") pod \"glance-3613-account-create-update-g4zwl\" (UID: \"473b7fa7-2de5-4e0a-921e-80880017c429\") " pod="openstack/glance-3613-account-create-update-g4zwl" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.002019 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-operator-scripts\") pod \"glance-db-create-2f2h7\" (UID: \"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a\") " pod="openstack/glance-db-create-2f2h7" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.002065 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/473b7fa7-2de5-4e0a-921e-80880017c429-operator-scripts\") pod \"glance-3613-account-create-update-g4zwl\" (UID: \"473b7fa7-2de5-4e0a-921e-80880017c429\") " pod="openstack/glance-3613-account-create-update-g4zwl" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.002100 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9pq2\" (UniqueName: \"kubernetes.io/projected/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-kube-api-access-m9pq2\") pod \"glance-db-create-2f2h7\" (UID: \"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a\") " pod="openstack/glance-db-create-2f2h7" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.003567 4777 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/473b7fa7-2de5-4e0a-921e-80880017c429-operator-scripts\") pod \"glance-3613-account-create-update-g4zwl\" (UID: \"473b7fa7-2de5-4e0a-921e-80880017c429\") " pod="openstack/glance-3613-account-create-update-g4zwl" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.004643 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-operator-scripts\") pod \"glance-db-create-2f2h7\" (UID: \"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a\") " pod="openstack/glance-db-create-2f2h7" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.023324 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9pq2\" (UniqueName: \"kubernetes.io/projected/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-kube-api-access-m9pq2\") pod \"glance-db-create-2f2h7\" (UID: \"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a\") " pod="openstack/glance-db-create-2f2h7" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.023776 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z54pl\" (UniqueName: \"kubernetes.io/projected/473b7fa7-2de5-4e0a-921e-80880017c429-kube-api-access-z54pl\") pod \"glance-3613-account-create-update-g4zwl\" (UID: \"473b7fa7-2de5-4e0a-921e-80880017c429\") " pod="openstack/glance-3613-account-create-update-g4zwl" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.131909 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2f2h7" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.178791 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3613-account-create-update-g4zwl" Feb 16 21:57:32 crc kubenswrapper[4777]: W0216 21:57:32.308373 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf855cb15_8085_43b1_825e_e6316c580924.slice/crio-cdf8023f37f565ae1d838a955120c01857b8612c9f4a420f6f08500386b47dec WatchSource:0}: Error finding container cdf8023f37f565ae1d838a955120c01857b8612c9f4a420f6f08500386b47dec: Status 404 returned error can't find the container with id cdf8023f37f565ae1d838a955120c01857b8612c9f4a420f6f08500386b47dec Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.710516 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-n67bs"] Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.712071 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-n67bs" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.719039 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d3e5-account-create-update-gb48v"] Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.725424 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d3e5-account-create-update-gb48v" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.726662 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.726893 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-n67bs"] Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.755665 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/525a2ff3-f0fb-45fb-8231-e18cea438b9c-operator-scripts\") pod \"keystone-db-create-n67bs\" (UID: \"525a2ff3-f0fb-45fb-8231-e18cea438b9c\") " pod="openstack/keystone-db-create-n67bs" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.755735 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qghhk\" (UniqueName: \"kubernetes.io/projected/f53cb73a-1d77-4527-b9c5-34f2091972a3-kube-api-access-qghhk\") pod \"keystone-d3e5-account-create-update-gb48v\" (UID: \"f53cb73a-1d77-4527-b9c5-34f2091972a3\") " pod="openstack/keystone-d3e5-account-create-update-gb48v" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.755830 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.755872 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8487\" (UniqueName: \"kubernetes.io/projected/525a2ff3-f0fb-45fb-8231-e18cea438b9c-kube-api-access-x8487\") pod \"keystone-db-create-n67bs\" (UID: \"525a2ff3-f0fb-45fb-8231-e18cea438b9c\") " 
pod="openstack/keystone-db-create-n67bs" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.755902 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53cb73a-1d77-4527-b9c5-34f2091972a3-operator-scripts\") pod \"keystone-d3e5-account-create-update-gb48v\" (UID: \"f53cb73a-1d77-4527-b9c5-34f2091972a3\") " pod="openstack/keystone-d3e5-account-create-update-gb48v" Feb 16 21:57:32 crc kubenswrapper[4777]: E0216 21:57:32.756101 4777 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 16 21:57:32 crc kubenswrapper[4777]: E0216 21:57:32.756114 4777 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 16 21:57:32 crc kubenswrapper[4777]: E0216 21:57:32.756157 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift podName:3392d073-5de5-4f7e-ae87-e892f769157a nodeName:}" failed. No retries permitted until 2026-02-16 21:57:40.756143057 +0000 UTC m=+1181.338644159 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift") pod "swift-storage-0" (UID: "3392d073-5de5-4f7e-ae87-e892f769157a") : configmap "swift-ring-files" not found Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.756192 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d3e5-account-create-update-gb48v"] Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.771098 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjx9d" event={"ID":"f855cb15-8085-43b1-825e-e6316c580924","Type":"ContainerStarted","Data":"cdf8023f37f565ae1d838a955120c01857b8612c9f4a420f6f08500386b47dec"} Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.774927 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a88f6b0-198b-4bb9-a593-607fae0b0a8d","Type":"ContainerStarted","Data":"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36"} Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.860024 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8487\" (UniqueName: \"kubernetes.io/projected/525a2ff3-f0fb-45fb-8231-e18cea438b9c-kube-api-access-x8487\") pod \"keystone-db-create-n67bs\" (UID: \"525a2ff3-f0fb-45fb-8231-e18cea438b9c\") " pod="openstack/keystone-db-create-n67bs" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.860078 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53cb73a-1d77-4527-b9c5-34f2091972a3-operator-scripts\") pod \"keystone-d3e5-account-create-update-gb48v\" (UID: \"f53cb73a-1d77-4527-b9c5-34f2091972a3\") " pod="openstack/keystone-d3e5-account-create-update-gb48v" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.860121 4777 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/525a2ff3-f0fb-45fb-8231-e18cea438b9c-operator-scripts\") pod \"keystone-db-create-n67bs\" (UID: \"525a2ff3-f0fb-45fb-8231-e18cea438b9c\") " pod="openstack/keystone-db-create-n67bs" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.860157 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qghhk\" (UniqueName: \"kubernetes.io/projected/f53cb73a-1d77-4527-b9c5-34f2091972a3-kube-api-access-qghhk\") pod \"keystone-d3e5-account-create-update-gb48v\" (UID: \"f53cb73a-1d77-4527-b9c5-34f2091972a3\") " pod="openstack/keystone-d3e5-account-create-update-gb48v" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.861056 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53cb73a-1d77-4527-b9c5-34f2091972a3-operator-scripts\") pod \"keystone-d3e5-account-create-update-gb48v\" (UID: \"f53cb73a-1d77-4527-b9c5-34f2091972a3\") " pod="openstack/keystone-d3e5-account-create-update-gb48v" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.861896 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/525a2ff3-f0fb-45fb-8231-e18cea438b9c-operator-scripts\") pod \"keystone-db-create-n67bs\" (UID: \"525a2ff3-f0fb-45fb-8231-e18cea438b9c\") " pod="openstack/keystone-db-create-n67bs" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.876499 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8487\" (UniqueName: \"kubernetes.io/projected/525a2ff3-f0fb-45fb-8231-e18cea438b9c-kube-api-access-x8487\") pod \"keystone-db-create-n67bs\" (UID: \"525a2ff3-f0fb-45fb-8231-e18cea438b9c\") " pod="openstack/keystone-db-create-n67bs" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.886786 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qghhk\" (UniqueName: \"kubernetes.io/projected/f53cb73a-1d77-4527-b9c5-34f2091972a3-kube-api-access-qghhk\") pod \"keystone-d3e5-account-create-update-gb48v\" (UID: \"f53cb73a-1d77-4527-b9c5-34f2091972a3\") " pod="openstack/keystone-d3e5-account-create-update-gb48v" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.916345 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-gk52z"] Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.918352 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gk52z" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.986222 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5ed2-account-create-update-bfgz5"] Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.987387 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ed2-account-create-update-bfgz5" Feb 16 21:57:32 crc kubenswrapper[4777]: I0216 21:57:32.990333 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.013787 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3613-account-create-update-g4zwl"] Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.038768 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gk52z"] Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.054975 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ed2-account-create-update-bfgz5"] Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.078187 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e168960-b8cc-45a1-98e5-5b2157b299a2-operator-scripts\") pod \"placement-db-create-gk52z\" (UID: 
\"2e168960-b8cc-45a1-98e5-5b2157b299a2\") " pod="openstack/placement-db-create-gk52z" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.078257 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5b4m\" (UniqueName: \"kubernetes.io/projected/2e168960-b8cc-45a1-98e5-5b2157b299a2-kube-api-access-d5b4m\") pod \"placement-db-create-gk52z\" (UID: \"2e168960-b8cc-45a1-98e5-5b2157b299a2\") " pod="openstack/placement-db-create-gk52z" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.078507 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-n67bs" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.093589 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d3e5-account-create-update-gb48v" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.143162 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2f2h7"] Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.179699 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kf57\" (UniqueName: \"kubernetes.io/projected/1f33b75b-727b-44c0-a663-672eb02c8862-kube-api-access-5kf57\") pod \"placement-5ed2-account-create-update-bfgz5\" (UID: \"1f33b75b-727b-44c0-a663-672eb02c8862\") " pod="openstack/placement-5ed2-account-create-update-bfgz5" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.179742 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f33b75b-727b-44c0-a663-672eb02c8862-operator-scripts\") pod \"placement-5ed2-account-create-update-bfgz5\" (UID: \"1f33b75b-727b-44c0-a663-672eb02c8862\") " pod="openstack/placement-5ed2-account-create-update-bfgz5" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.179817 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e168960-b8cc-45a1-98e5-5b2157b299a2-operator-scripts\") pod \"placement-db-create-gk52z\" (UID: \"2e168960-b8cc-45a1-98e5-5b2157b299a2\") " pod="openstack/placement-db-create-gk52z" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.179963 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5b4m\" (UniqueName: \"kubernetes.io/projected/2e168960-b8cc-45a1-98e5-5b2157b299a2-kube-api-access-d5b4m\") pod \"placement-db-create-gk52z\" (UID: \"2e168960-b8cc-45a1-98e5-5b2157b299a2\") " pod="openstack/placement-db-create-gk52z" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.191857 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e168960-b8cc-45a1-98e5-5b2157b299a2-operator-scripts\") pod \"placement-db-create-gk52z\" (UID: \"2e168960-b8cc-45a1-98e5-5b2157b299a2\") " pod="openstack/placement-db-create-gk52z" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.204133 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5b4m\" (UniqueName: \"kubernetes.io/projected/2e168960-b8cc-45a1-98e5-5b2157b299a2-kube-api-access-d5b4m\") pod \"placement-db-create-gk52z\" (UID: \"2e168960-b8cc-45a1-98e5-5b2157b299a2\") " pod="openstack/placement-db-create-gk52z" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.281872 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kf57\" (UniqueName: \"kubernetes.io/projected/1f33b75b-727b-44c0-a663-672eb02c8862-kube-api-access-5kf57\") pod \"placement-5ed2-account-create-update-bfgz5\" (UID: \"1f33b75b-727b-44c0-a663-672eb02c8862\") " pod="openstack/placement-5ed2-account-create-update-bfgz5" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.282093 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f33b75b-727b-44c0-a663-672eb02c8862-operator-scripts\") pod \"placement-5ed2-account-create-update-bfgz5\" (UID: \"1f33b75b-727b-44c0-a663-672eb02c8862\") " pod="openstack/placement-5ed2-account-create-update-bfgz5" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.282879 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f33b75b-727b-44c0-a663-672eb02c8862-operator-scripts\") pod \"placement-5ed2-account-create-update-bfgz5\" (UID: \"1f33b75b-727b-44c0-a663-672eb02c8862\") " pod="openstack/placement-5ed2-account-create-update-bfgz5" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.308379 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gk52z" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.309960 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kf57\" (UniqueName: \"kubernetes.io/projected/1f33b75b-727b-44c0-a663-672eb02c8862-kube-api-access-5kf57\") pod \"placement-5ed2-account-create-update-bfgz5\" (UID: \"1f33b75b-727b-44c0-a663-672eb02c8862\") " pod="openstack/placement-5ed2-account-create-update-bfgz5" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.322791 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5ed2-account-create-update-bfgz5" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.629853 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-n67bs"] Feb 16 21:57:33 crc kubenswrapper[4777]: W0216 21:57:33.696506 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod525a2ff3_f0fb_45fb_8231_e18cea438b9c.slice/crio-4c9418fb2f7a67d800de0ff141a53758164985de6561533a8661b1debd30a985 WatchSource:0}: Error finding container 4c9418fb2f7a67d800de0ff141a53758164985de6561533a8661b1debd30a985: Status 404 returned error can't find the container with id 4c9418fb2f7a67d800de0ff141a53758164985de6561533a8661b1debd30a985 Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.782933 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"087976e9-29d9-4bc1-9939-e73ef0be9e0e","Type":"ContainerStarted","Data":"f9000ab0a7aa65e6e17dddeb4246b9ffd841b558bb111f1493e03777998ad709"} Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.782971 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"087976e9-29d9-4bc1-9939-e73ef0be9e0e","Type":"ContainerStarted","Data":"f697756a9e115f7c05190f2c107eaf6f7a069d06049c5f11c9dc0f0ea22190af"} Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.783075 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.783418 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d3e5-account-create-update-gb48v"] Feb 16 21:57:33 crc kubenswrapper[4777]: W0216 21:57:33.786002 4777 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf53cb73a_1d77_4527_b9c5_34f2091972a3.slice/crio-0e38f4f1fb5ec2bb8ac1f8b9faab167ea9e14419158815a7981fd28f9e6d1e2a WatchSource:0}: Error finding container 0e38f4f1fb5ec2bb8ac1f8b9faab167ea9e14419158815a7981fd28f9e6d1e2a: Status 404 returned error can't find the container with id 0e38f4f1fb5ec2bb8ac1f8b9faab167ea9e14419158815a7981fd28f9e6d1e2a Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.787237 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n67bs" event={"ID":"525a2ff3-f0fb-45fb-8231-e18cea438b9c","Type":"ContainerStarted","Data":"4c9418fb2f7a67d800de0ff141a53758164985de6561533a8661b1debd30a985"} Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.789792 4777 generic.go:334] "Generic (PLEG): container finished" podID="473b7fa7-2de5-4e0a-921e-80880017c429" containerID="16e74f80d34047aae280d4cb00f66b7b04f522d3e15d239d403bdad590286693" exitCode=0 Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.789864 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3613-account-create-update-g4zwl" event={"ID":"473b7fa7-2de5-4e0a-921e-80880017c429","Type":"ContainerDied","Data":"16e74f80d34047aae280d4cb00f66b7b04f522d3e15d239d403bdad590286693"} Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.789891 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3613-account-create-update-g4zwl" event={"ID":"473b7fa7-2de5-4e0a-921e-80880017c429","Type":"ContainerStarted","Data":"981074ef086d60e4b609fd09ec80eee34b214638ae71be30412db98f8243c978"} Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.794062 4777 generic.go:334] "Generic (PLEG): container finished" podID="e9eb4b5f-7a71-4f57-837b-73f4544c0a4a" containerID="58e1fae68ae66eb14d8ac2623ba9582eab0d8f91d406bfa69593cabf55636cb3" exitCode=0 Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.794103 4777 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-db-create-2f2h7" event={"ID":"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a","Type":"ContainerDied","Data":"58e1fae68ae66eb14d8ac2623ba9582eab0d8f91d406bfa69593cabf55636cb3"} Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.794122 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2f2h7" event={"ID":"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a","Type":"ContainerStarted","Data":"9d9fd7594f883faeb0ecd00788fc31eeb440f055e4491b17abb5243e60f83183"} Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.804085 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=5.609775681 podStartE2EDuration="7.804069596s" podCreationTimestamp="2026-02-16 21:57:26 +0000 UTC" firstStartedPulling="2026-02-16 21:57:27.382936019 +0000 UTC m=+1167.965437121" lastFinishedPulling="2026-02-16 21:57:29.577229924 +0000 UTC m=+1170.159731036" observedRunningTime="2026-02-16 21:57:33.800037593 +0000 UTC m=+1174.382538695" watchObservedRunningTime="2026-02-16 21:57:33.804069596 +0000 UTC m=+1174.386570698" Feb 16 21:57:33 crc kubenswrapper[4777]: W0216 21:57:33.874694 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e168960_b8cc_45a1_98e5_5b2157b299a2.slice/crio-644082621c5e8c78db5c8af547fdae127291716f8e3ac295fd34025d48b5cf9a WatchSource:0}: Error finding container 644082621c5e8c78db5c8af547fdae127291716f8e3ac295fd34025d48b5cf9a: Status 404 returned error can't find the container with id 644082621c5e8c78db5c8af547fdae127291716f8e3ac295fd34025d48b5cf9a Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.876927 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gk52z"] Feb 16 21:57:33 crc kubenswrapper[4777]: I0216 21:57:33.931169 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ed2-account-create-update-bfgz5"] Feb 16 
21:57:33 crc kubenswrapper[4777]: W0216 21:57:33.941268 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f33b75b_727b_44c0_a663_672eb02c8862.slice/crio-f5db6320a8d29339717e18e34eb717027bcea791ad5551a74c9833ebf3a0949c WatchSource:0}: Error finding container f5db6320a8d29339717e18e34eb717027bcea791ad5551a74c9833ebf3a0949c: Status 404 returned error can't find the container with id f5db6320a8d29339717e18e34eb717027bcea791ad5551a74c9833ebf3a0949c Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.171215 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.229737 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n77h7"] Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.229953 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" podUID="150daccc-a84b-457f-8685-468d3b017302" containerName="dnsmasq-dns" containerID="cri-o://361656afb488844efa84b835e661a92e001020b9017a597d9cb5058a39ff063c" gracePeriod=10 Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.805295 4777 generic.go:334] "Generic (PLEG): container finished" podID="1f33b75b-727b-44c0-a663-672eb02c8862" containerID="a8dd4a159b0dcc5a5eb78e45698ccdde0b68f7b0cc10ff560d6086af4b139ec4" exitCode=0 Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.805353 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ed2-account-create-update-bfgz5" event={"ID":"1f33b75b-727b-44c0-a663-672eb02c8862","Type":"ContainerDied","Data":"a8dd4a159b0dcc5a5eb78e45698ccdde0b68f7b0cc10ff560d6086af4b139ec4"} Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.805376 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ed2-account-create-update-bfgz5" 
event={"ID":"1f33b75b-727b-44c0-a663-672eb02c8862","Type":"ContainerStarted","Data":"f5db6320a8d29339717e18e34eb717027bcea791ad5551a74c9833ebf3a0949c"} Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.808320 4777 generic.go:334] "Generic (PLEG): container finished" podID="150daccc-a84b-457f-8685-468d3b017302" containerID="361656afb488844efa84b835e661a92e001020b9017a597d9cb5058a39ff063c" exitCode=0 Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.808366 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" event={"ID":"150daccc-a84b-457f-8685-468d3b017302","Type":"ContainerDied","Data":"361656afb488844efa84b835e661a92e001020b9017a597d9cb5058a39ff063c"} Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.810175 4777 generic.go:334] "Generic (PLEG): container finished" podID="2e168960-b8cc-45a1-98e5-5b2157b299a2" containerID="d35620637999f5cd8ece8a4c7c76a47bf52b4ed6004267c2a839062312294c31" exitCode=0 Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.810223 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gk52z" event={"ID":"2e168960-b8cc-45a1-98e5-5b2157b299a2","Type":"ContainerDied","Data":"d35620637999f5cd8ece8a4c7c76a47bf52b4ed6004267c2a839062312294c31"} Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.810238 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gk52z" event={"ID":"2e168960-b8cc-45a1-98e5-5b2157b299a2","Type":"ContainerStarted","Data":"644082621c5e8c78db5c8af547fdae127291716f8e3ac295fd34025d48b5cf9a"} Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.812017 4777 generic.go:334] "Generic (PLEG): container finished" podID="f53cb73a-1d77-4527-b9c5-34f2091972a3" containerID="019d0a474c1289621b07f9f000fb21ff4781d5d0d6fdf6d2e59f3aa45a1675f9" exitCode=0 Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.812065 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-d3e5-account-create-update-gb48v" event={"ID":"f53cb73a-1d77-4527-b9c5-34f2091972a3","Type":"ContainerDied","Data":"019d0a474c1289621b07f9f000fb21ff4781d5d0d6fdf6d2e59f3aa45a1675f9"} Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.812079 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d3e5-account-create-update-gb48v" event={"ID":"f53cb73a-1d77-4527-b9c5-34f2091972a3","Type":"ContainerStarted","Data":"0e38f4f1fb5ec2bb8ac1f8b9faab167ea9e14419158815a7981fd28f9e6d1e2a"} Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.813906 4777 generic.go:334] "Generic (PLEG): container finished" podID="525a2ff3-f0fb-45fb-8231-e18cea438b9c" containerID="e6e0fac9a23539a5e1997ed9ef0cf2b5b25096a74896d26c9a4a784460fffcb8" exitCode=0 Feb 16 21:57:34 crc kubenswrapper[4777]: I0216 21:57:34.814118 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n67bs" event={"ID":"525a2ff3-f0fb-45fb-8231-e18cea438b9c","Type":"ContainerDied","Data":"e6e0fac9a23539a5e1997ed9ef0cf2b5b25096a74896d26c9a4a784460fffcb8"} Feb 16 21:57:35 crc kubenswrapper[4777]: I0216 21:57:35.824078 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a88f6b0-198b-4bb9-a593-607fae0b0a8d","Type":"ContainerStarted","Data":"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194"} Feb 16 21:57:36 crc kubenswrapper[4777]: I0216 21:57:36.399626 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a882b0c3-7f2e-446b-aea4-476cacffb112" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.486101 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-n67bs" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.495444 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2f2h7" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.532587 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3613-account-create-update-g4zwl" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.543460 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d3e5-account-create-update-gb48v" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.557762 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ed2-account-create-update-bfgz5" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.570636 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.572795 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/525a2ff3-f0fb-45fb-8231-e18cea438b9c-operator-scripts\") pod \"525a2ff3-f0fb-45fb-8231-e18cea438b9c\" (UID: \"525a2ff3-f0fb-45fb-8231-e18cea438b9c\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.572981 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8487\" (UniqueName: \"kubernetes.io/projected/525a2ff3-f0fb-45fb-8231-e18cea438b9c-kube-api-access-x8487\") pod \"525a2ff3-f0fb-45fb-8231-e18cea438b9c\" (UID: \"525a2ff3-f0fb-45fb-8231-e18cea438b9c\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.573586 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/525a2ff3-f0fb-45fb-8231-e18cea438b9c-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "525a2ff3-f0fb-45fb-8231-e18cea438b9c" (UID: "525a2ff3-f0fb-45fb-8231-e18cea438b9c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.577853 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525a2ff3-f0fb-45fb-8231-e18cea438b9c-kube-api-access-x8487" (OuterVolumeSpecName: "kube-api-access-x8487") pod "525a2ff3-f0fb-45fb-8231-e18cea438b9c" (UID: "525a2ff3-f0fb-45fb-8231-e18cea438b9c"). InnerVolumeSpecName "kube-api-access-x8487". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.593994 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gk52z" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.676333 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/473b7fa7-2de5-4e0a-921e-80880017c429-operator-scripts\") pod \"473b7fa7-2de5-4e0a-921e-80880017c429\" (UID: \"473b7fa7-2de5-4e0a-921e-80880017c429\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.676758 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-config\") pod \"150daccc-a84b-457f-8685-468d3b017302\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.676812 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z54pl\" (UniqueName: \"kubernetes.io/projected/473b7fa7-2de5-4e0a-921e-80880017c429-kube-api-access-z54pl\") pod \"473b7fa7-2de5-4e0a-921e-80880017c429\" (UID: \"473b7fa7-2de5-4e0a-921e-80880017c429\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 
21:57:37.676839 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-dns-svc\") pod \"150daccc-a84b-457f-8685-468d3b017302\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.676862 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53cb73a-1d77-4527-b9c5-34f2091972a3-operator-scripts\") pod \"f53cb73a-1d77-4527-b9c5-34f2091972a3\" (UID: \"f53cb73a-1d77-4527-b9c5-34f2091972a3\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.676903 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kf57\" (UniqueName: \"kubernetes.io/projected/1f33b75b-727b-44c0-a663-672eb02c8862-kube-api-access-5kf57\") pod \"1f33b75b-727b-44c0-a663-672eb02c8862\" (UID: \"1f33b75b-727b-44c0-a663-672eb02c8862\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.676923 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9pq2\" (UniqueName: \"kubernetes.io/projected/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-kube-api-access-m9pq2\") pod \"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a\" (UID: \"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.676957 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qghhk\" (UniqueName: \"kubernetes.io/projected/f53cb73a-1d77-4527-b9c5-34f2091972a3-kube-api-access-qghhk\") pod \"f53cb73a-1d77-4527-b9c5-34f2091972a3\" (UID: \"f53cb73a-1d77-4527-b9c5-34f2091972a3\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.676988 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5b4m\" (UniqueName: 
\"kubernetes.io/projected/2e168960-b8cc-45a1-98e5-5b2157b299a2-kube-api-access-d5b4m\") pod \"2e168960-b8cc-45a1-98e5-5b2157b299a2\" (UID: \"2e168960-b8cc-45a1-98e5-5b2157b299a2\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.676991 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/473b7fa7-2de5-4e0a-921e-80880017c429-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "473b7fa7-2de5-4e0a-921e-80880017c429" (UID: "473b7fa7-2de5-4e0a-921e-80880017c429"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.677004 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f33b75b-727b-44c0-a663-672eb02c8862-operator-scripts\") pod \"1f33b75b-727b-44c0-a663-672eb02c8862\" (UID: \"1f33b75b-727b-44c0-a663-672eb02c8862\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.677023 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-operator-scripts\") pod \"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a\" (UID: \"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.677050 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e168960-b8cc-45a1-98e5-5b2157b299a2-operator-scripts\") pod \"2e168960-b8cc-45a1-98e5-5b2157b299a2\" (UID: \"2e168960-b8cc-45a1-98e5-5b2157b299a2\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.677098 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9zw4\" (UniqueName: \"kubernetes.io/projected/150daccc-a84b-457f-8685-468d3b017302-kube-api-access-g9zw4\") pod 
\"150daccc-a84b-457f-8685-468d3b017302\" (UID: \"150daccc-a84b-457f-8685-468d3b017302\") " Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.677753 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/525a2ff3-f0fb-45fb-8231-e18cea438b9c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.677767 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/473b7fa7-2de5-4e0a-921e-80880017c429-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.677776 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8487\" (UniqueName: \"kubernetes.io/projected/525a2ff3-f0fb-45fb-8231-e18cea438b9c-kube-api-access-x8487\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.679784 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f33b75b-727b-44c0-a663-672eb02c8862-kube-api-access-5kf57" (OuterVolumeSpecName: "kube-api-access-5kf57") pod "1f33b75b-727b-44c0-a663-672eb02c8862" (UID: "1f33b75b-727b-44c0-a663-672eb02c8862"). InnerVolumeSpecName "kube-api-access-5kf57". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.680168 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-kube-api-access-m9pq2" (OuterVolumeSpecName: "kube-api-access-m9pq2") pod "e9eb4b5f-7a71-4f57-837b-73f4544c0a4a" (UID: "e9eb4b5f-7a71-4f57-837b-73f4544c0a4a"). InnerVolumeSpecName "kube-api-access-m9pq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.680573 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9eb4b5f-7a71-4f57-837b-73f4544c0a4a" (UID: "e9eb4b5f-7a71-4f57-837b-73f4544c0a4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.680999 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f53cb73a-1d77-4527-b9c5-34f2091972a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f53cb73a-1d77-4527-b9c5-34f2091972a3" (UID: "f53cb73a-1d77-4527-b9c5-34f2091972a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.681037 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e168960-b8cc-45a1-98e5-5b2157b299a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e168960-b8cc-45a1-98e5-5b2157b299a2" (UID: "2e168960-b8cc-45a1-98e5-5b2157b299a2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.681358 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f33b75b-727b-44c0-a663-672eb02c8862-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f33b75b-727b-44c0-a663-672eb02c8862" (UID: "1f33b75b-727b-44c0-a663-672eb02c8862"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.682795 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53cb73a-1d77-4527-b9c5-34f2091972a3-kube-api-access-qghhk" (OuterVolumeSpecName: "kube-api-access-qghhk") pod "f53cb73a-1d77-4527-b9c5-34f2091972a3" (UID: "f53cb73a-1d77-4527-b9c5-34f2091972a3"). InnerVolumeSpecName "kube-api-access-qghhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.683892 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473b7fa7-2de5-4e0a-921e-80880017c429-kube-api-access-z54pl" (OuterVolumeSpecName: "kube-api-access-z54pl") pod "473b7fa7-2de5-4e0a-921e-80880017c429" (UID: "473b7fa7-2de5-4e0a-921e-80880017c429"). InnerVolumeSpecName "kube-api-access-z54pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.684056 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e168960-b8cc-45a1-98e5-5b2157b299a2-kube-api-access-d5b4m" (OuterVolumeSpecName: "kube-api-access-d5b4m") pod "2e168960-b8cc-45a1-98e5-5b2157b299a2" (UID: "2e168960-b8cc-45a1-98e5-5b2157b299a2"). InnerVolumeSpecName "kube-api-access-d5b4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.684422 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150daccc-a84b-457f-8685-468d3b017302-kube-api-access-g9zw4" (OuterVolumeSpecName: "kube-api-access-g9zw4") pod "150daccc-a84b-457f-8685-468d3b017302" (UID: "150daccc-a84b-457f-8685-468d3b017302"). InnerVolumeSpecName "kube-api-access-g9zw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.719236 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-config" (OuterVolumeSpecName: "config") pod "150daccc-a84b-457f-8685-468d3b017302" (UID: "150daccc-a84b-457f-8685-468d3b017302"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.735799 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "150daccc-a84b-457f-8685-468d3b017302" (UID: "150daccc-a84b-457f-8685-468d3b017302"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.779173 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.779203 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z54pl\" (UniqueName: \"kubernetes.io/projected/473b7fa7-2de5-4e0a-921e-80880017c429-kube-api-access-z54pl\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.779217 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/150daccc-a84b-457f-8685-468d3b017302-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.779231 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53cb73a-1d77-4527-b9c5-34f2091972a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: 
I0216 21:57:37.779243 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kf57\" (UniqueName: \"kubernetes.io/projected/1f33b75b-727b-44c0-a663-672eb02c8862-kube-api-access-5kf57\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.779257 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9pq2\" (UniqueName: \"kubernetes.io/projected/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-kube-api-access-m9pq2\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.779269 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qghhk\" (UniqueName: \"kubernetes.io/projected/f53cb73a-1d77-4527-b9c5-34f2091972a3-kube-api-access-qghhk\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.779281 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5b4m\" (UniqueName: \"kubernetes.io/projected/2e168960-b8cc-45a1-98e5-5b2157b299a2-kube-api-access-d5b4m\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.779293 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f33b75b-727b-44c0-a663-672eb02c8862-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.779305 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.779316 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e168960-b8cc-45a1-98e5-5b2157b299a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.779328 4777 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-g9zw4\" (UniqueName: \"kubernetes.io/projected/150daccc-a84b-457f-8685-468d3b017302-kube-api-access-g9zw4\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.840014 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d3e5-account-create-update-gb48v" event={"ID":"f53cb73a-1d77-4527-b9c5-34f2091972a3","Type":"ContainerDied","Data":"0e38f4f1fb5ec2bb8ac1f8b9faab167ea9e14419158815a7981fd28f9e6d1e2a"} Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.840048 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e38f4f1fb5ec2bb8ac1f8b9faab167ea9e14419158815a7981fd28f9e6d1e2a" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.840058 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d3e5-account-create-update-gb48v" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.841447 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n67bs" event={"ID":"525a2ff3-f0fb-45fb-8231-e18cea438b9c","Type":"ContainerDied","Data":"4c9418fb2f7a67d800de0ff141a53758164985de6561533a8661b1debd30a985"} Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.841474 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c9418fb2f7a67d800de0ff141a53758164985de6561533a8661b1debd30a985" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.841550 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-n67bs" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.851139 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjx9d" event={"ID":"f855cb15-8085-43b1-825e-e6316c580924","Type":"ContainerStarted","Data":"fdc613986a99ffe553201605a55ba8f39041b662586f77f0e728fd305f8a45b6"} Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.853203 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ed2-account-create-update-bfgz5" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.853228 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ed2-account-create-update-bfgz5" event={"ID":"1f33b75b-727b-44c0-a663-672eb02c8862","Type":"ContainerDied","Data":"f5db6320a8d29339717e18e34eb717027bcea791ad5551a74c9833ebf3a0949c"} Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.853269 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5db6320a8d29339717e18e34eb717027bcea791ad5551a74c9833ebf3a0949c" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.867363 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-rjx9d" podStartSLOduration=4.879512022 podStartE2EDuration="9.867346917s" podCreationTimestamp="2026-02-16 21:57:28 +0000 UTC" firstStartedPulling="2026-02-16 21:57:32.337312076 +0000 UTC m=+1172.919813178" lastFinishedPulling="2026-02-16 21:57:37.325146971 +0000 UTC m=+1177.907648073" observedRunningTime="2026-02-16 21:57:37.865464114 +0000 UTC m=+1178.447965206" watchObservedRunningTime="2026-02-16 21:57:37.867346917 +0000 UTC m=+1178.449848019" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.872448 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2f2h7" 
event={"ID":"e9eb4b5f-7a71-4f57-837b-73f4544c0a4a","Type":"ContainerDied","Data":"9d9fd7594f883faeb0ecd00788fc31eeb440f055e4491b17abb5243e60f83183"} Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.872521 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d9fd7594f883faeb0ecd00788fc31eeb440f055e4491b17abb5243e60f83183" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.872632 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2f2h7" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.877434 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3613-account-create-update-g4zwl" event={"ID":"473b7fa7-2de5-4e0a-921e-80880017c429","Type":"ContainerDied","Data":"981074ef086d60e4b609fd09ec80eee34b214638ae71be30412db98f8243c978"} Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.877640 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="981074ef086d60e4b609fd09ec80eee34b214638ae71be30412db98f8243c978" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.877858 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3613-account-create-update-g4zwl" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.885421 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" event={"ID":"150daccc-a84b-457f-8685-468d3b017302","Type":"ContainerDied","Data":"e783b3b5cd3a3192cec9876457ccaa3c35add34b06bfaea87cc891ad24ef3b0b"} Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.885492 4777 scope.go:117] "RemoveContainer" containerID="361656afb488844efa84b835e661a92e001020b9017a597d9cb5058a39ff063c" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.885489 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.891252 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gk52z" event={"ID":"2e168960-b8cc-45a1-98e5-5b2157b299a2","Type":"ContainerDied","Data":"644082621c5e8c78db5c8af547fdae127291716f8e3ac295fd34025d48b5cf9a"} Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.891277 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="644082621c5e8c78db5c8af547fdae127291716f8e3ac295fd34025d48b5cf9a" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.891318 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gk52z" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.912849 4777 scope.go:117] "RemoveContainer" containerID="bcbc16120135eeb57db1c4f6be780013a3ffd0a1e492e6de5f840dbdfe5f2507" Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.940107 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n77h7"] Feb 16 21:57:37 crc kubenswrapper[4777]: I0216 21:57:37.946173 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n77h7"] Feb 16 21:57:38 crc kubenswrapper[4777]: E0216 21:57:38.103937 4777 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod150daccc_a84b_457f_8685_468d3b017302.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod150daccc_a84b_457f_8685_468d3b017302.slice/crio-e783b3b5cd3a3192cec9876457ccaa3c35add34b06bfaea87cc891ad24ef3b0b\": RecentStats: unable to find data in memory cache]" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.193485 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="150daccc-a84b-457f-8685-468d3b017302" path="/var/lib/kubelet/pods/150daccc-a84b-457f-8685-468d3b017302/volumes" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.448393 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cqzjw"] Feb 16 21:57:38 crc kubenswrapper[4777]: E0216 21:57:38.448775 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150daccc-a84b-457f-8685-468d3b017302" containerName="init" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.448794 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="150daccc-a84b-457f-8685-468d3b017302" containerName="init" Feb 16 21:57:38 crc kubenswrapper[4777]: E0216 21:57:38.448815 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e168960-b8cc-45a1-98e5-5b2157b299a2" containerName="mariadb-database-create" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.448823 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e168960-b8cc-45a1-98e5-5b2157b299a2" containerName="mariadb-database-create" Feb 16 21:57:38 crc kubenswrapper[4777]: E0216 21:57:38.448834 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f33b75b-727b-44c0-a663-672eb02c8862" containerName="mariadb-account-create-update" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.448841 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f33b75b-727b-44c0-a663-672eb02c8862" containerName="mariadb-account-create-update" Feb 16 21:57:38 crc kubenswrapper[4777]: E0216 21:57:38.448855 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53cb73a-1d77-4527-b9c5-34f2091972a3" containerName="mariadb-account-create-update" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.448860 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53cb73a-1d77-4527-b9c5-34f2091972a3" containerName="mariadb-account-create-update" Feb 16 21:57:38 crc kubenswrapper[4777]: E0216 21:57:38.448869 4777 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473b7fa7-2de5-4e0a-921e-80880017c429" containerName="mariadb-account-create-update" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.448876 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="473b7fa7-2de5-4e0a-921e-80880017c429" containerName="mariadb-account-create-update" Feb 16 21:57:38 crc kubenswrapper[4777]: E0216 21:57:38.448887 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150daccc-a84b-457f-8685-468d3b017302" containerName="dnsmasq-dns" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.448893 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="150daccc-a84b-457f-8685-468d3b017302" containerName="dnsmasq-dns" Feb 16 21:57:38 crc kubenswrapper[4777]: E0216 21:57:38.448907 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525a2ff3-f0fb-45fb-8231-e18cea438b9c" containerName="mariadb-database-create" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.448913 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="525a2ff3-f0fb-45fb-8231-e18cea438b9c" containerName="mariadb-database-create" Feb 16 21:57:38 crc kubenswrapper[4777]: E0216 21:57:38.448922 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9eb4b5f-7a71-4f57-837b-73f4544c0a4a" containerName="mariadb-database-create" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.448928 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9eb4b5f-7a71-4f57-837b-73f4544c0a4a" containerName="mariadb-database-create" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.449076 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53cb73a-1d77-4527-b9c5-34f2091972a3" containerName="mariadb-account-create-update" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.449089 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f33b75b-727b-44c0-a663-672eb02c8862" 
containerName="mariadb-account-create-update" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.449096 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="150daccc-a84b-457f-8685-468d3b017302" containerName="dnsmasq-dns" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.449106 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e168960-b8cc-45a1-98e5-5b2157b299a2" containerName="mariadb-database-create" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.449117 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="473b7fa7-2de5-4e0a-921e-80880017c429" containerName="mariadb-account-create-update" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.449126 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9eb4b5f-7a71-4f57-837b-73f4544c0a4a" containerName="mariadb-database-create" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.449133 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="525a2ff3-f0fb-45fb-8231-e18cea438b9c" containerName="mariadb-database-create" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.449799 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cqzjw" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.451683 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.475634 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cqzjw"] Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.611490 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106c9067-3fff-47b4-a27f-5d2a70d8d976-operator-scripts\") pod \"root-account-create-update-cqzjw\" (UID: \"106c9067-3fff-47b4-a27f-5d2a70d8d976\") " pod="openstack/root-account-create-update-cqzjw" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.611536 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt4ss\" (UniqueName: \"kubernetes.io/projected/106c9067-3fff-47b4-a27f-5d2a70d8d976-kube-api-access-zt4ss\") pod \"root-account-create-update-cqzjw\" (UID: \"106c9067-3fff-47b4-a27f-5d2a70d8d976\") " pod="openstack/root-account-create-update-cqzjw" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.713411 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106c9067-3fff-47b4-a27f-5d2a70d8d976-operator-scripts\") pod \"root-account-create-update-cqzjw\" (UID: \"106c9067-3fff-47b4-a27f-5d2a70d8d976\") " pod="openstack/root-account-create-update-cqzjw" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.713457 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt4ss\" (UniqueName: \"kubernetes.io/projected/106c9067-3fff-47b4-a27f-5d2a70d8d976-kube-api-access-zt4ss\") pod \"root-account-create-update-cqzjw\" (UID: 
\"106c9067-3fff-47b4-a27f-5d2a70d8d976\") " pod="openstack/root-account-create-update-cqzjw" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.714282 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106c9067-3fff-47b4-a27f-5d2a70d8d976-operator-scripts\") pod \"root-account-create-update-cqzjw\" (UID: \"106c9067-3fff-47b4-a27f-5d2a70d8d976\") " pod="openstack/root-account-create-update-cqzjw" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.741647 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt4ss\" (UniqueName: \"kubernetes.io/projected/106c9067-3fff-47b4-a27f-5d2a70d8d976-kube-api-access-zt4ss\") pod \"root-account-create-update-cqzjw\" (UID: \"106c9067-3fff-47b4-a27f-5d2a70d8d976\") " pod="openstack/root-account-create-update-cqzjw" Feb 16 21:57:38 crc kubenswrapper[4777]: I0216 21:57:38.771649 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cqzjw" Feb 16 21:57:39 crc kubenswrapper[4777]: I0216 21:57:39.908621 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a88f6b0-198b-4bb9-a593-607fae0b0a8d","Type":"ContainerStarted","Data":"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118"} Feb 16 21:57:39 crc kubenswrapper[4777]: I0216 21:57:39.945041 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=11.292571154 podStartE2EDuration="46.945019736s" podCreationTimestamp="2026-02-16 21:56:53 +0000 UTC" firstStartedPulling="2026-02-16 21:57:03.936614205 +0000 UTC m=+1144.519115307" lastFinishedPulling="2026-02-16 21:57:39.589062777 +0000 UTC m=+1180.171563889" observedRunningTime="2026-02-16 21:57:39.931504058 +0000 UTC m=+1180.514005190" watchObservedRunningTime="2026-02-16 21:57:39.945019736 +0000 UTC m=+1180.527520838" Feb 16 21:57:40 crc kubenswrapper[4777]: W0216 21:57:40.037844 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod106c9067_3fff_47b4_a27f_5d2a70d8d976.slice/crio-201e9782b7abdc4765379f37f7c07c9788fcd7e4a68652664a345f9d59b4556c WatchSource:0}: Error finding container 201e9782b7abdc4765379f37f7c07c9788fcd7e4a68652664a345f9d59b4556c: Status 404 returned error can't find the container with id 201e9782b7abdc4765379f37f7c07c9788fcd7e4a68652664a345f9d59b4556c Feb 16 21:57:40 crc kubenswrapper[4777]: I0216 21:57:40.038105 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cqzjw"] Feb 16 21:57:40 crc kubenswrapper[4777]: I0216 21:57:40.064742 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 16 21:57:40 crc kubenswrapper[4777]: I0216 21:57:40.065077 4777 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 16 21:57:40 crc kubenswrapper[4777]: I0216 21:57:40.067349 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 16 21:57:40 crc kubenswrapper[4777]: I0216 21:57:40.796168 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:40 crc kubenswrapper[4777]: E0216 21:57:40.796353 4777 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 16 21:57:40 crc kubenswrapper[4777]: E0216 21:57:40.796696 4777 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 16 21:57:40 crc kubenswrapper[4777]: E0216 21:57:40.796854 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift podName:3392d073-5de5-4f7e-ae87-e892f769157a nodeName:}" failed. No retries permitted until 2026-02-16 21:57:56.796835143 +0000 UTC m=+1197.379336255 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift") pod "swift-storage-0" (UID: "3392d073-5de5-4f7e-ae87-e892f769157a") : configmap "swift-ring-files" not found Feb 16 21:57:40 crc kubenswrapper[4777]: I0216 21:57:40.948371 4777 generic.go:334] "Generic (PLEG): container finished" podID="106c9067-3fff-47b4-a27f-5d2a70d8d976" containerID="6b6202e2fcd41bc1f83a6eb4180a9b5138121d6dfe1affff17682819aeeee2ff" exitCode=0 Feb 16 21:57:40 crc kubenswrapper[4777]: I0216 21:57:40.949853 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cqzjw" event={"ID":"106c9067-3fff-47b4-a27f-5d2a70d8d976","Type":"ContainerDied","Data":"6b6202e2fcd41bc1f83a6eb4180a9b5138121d6dfe1affff17682819aeeee2ff"} Feb 16 21:57:40 crc kubenswrapper[4777]: I0216 21:57:40.949911 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cqzjw" event={"ID":"106c9067-3fff-47b4-a27f-5d2a70d8d976","Type":"ContainerStarted","Data":"201e9782b7abdc4765379f37f7c07c9788fcd7e4a68652664a345f9d59b4556c"} Feb 16 21:57:40 crc kubenswrapper[4777]: I0216 21:57:40.950832 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.216990 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-fwhrq"] Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.218602 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.223575 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.223741 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cbq8q" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.229658 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-combined-ca-bundle\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.229756 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-db-sync-config-data\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.229778 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xn8l\" (UniqueName: \"kubernetes.io/projected/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-kube-api-access-6xn8l\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.229804 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-config-data\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 
crc kubenswrapper[4777]: I0216 21:57:42.275590 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fwhrq"] Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.331907 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-combined-ca-bundle\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.332024 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xn8l\" (UniqueName: \"kubernetes.io/projected/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-kube-api-access-6xn8l\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.332040 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-db-sync-config-data\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.332063 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-config-data\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.337852 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-db-sync-config-data\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " 
pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.339137 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-config-data\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.340322 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-combined-ca-bundle\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.349883 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xn8l\" (UniqueName: \"kubernetes.io/projected/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-kube-api-access-6xn8l\") pod \"glance-db-sync-fwhrq\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.443773 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cqzjw" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.513079 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-n77h7" podUID="150daccc-a84b-457f-8685-468d3b017302" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.105:5353: i/o timeout" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.535246 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt4ss\" (UniqueName: \"kubernetes.io/projected/106c9067-3fff-47b4-a27f-5d2a70d8d976-kube-api-access-zt4ss\") pod \"106c9067-3fff-47b4-a27f-5d2a70d8d976\" (UID: \"106c9067-3fff-47b4-a27f-5d2a70d8d976\") " Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.535666 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106c9067-3fff-47b4-a27f-5d2a70d8d976-operator-scripts\") pod \"106c9067-3fff-47b4-a27f-5d2a70d8d976\" (UID: \"106c9067-3fff-47b4-a27f-5d2a70d8d976\") " Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.536025 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/106c9067-3fff-47b4-a27f-5d2a70d8d976-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "106c9067-3fff-47b4-a27f-5d2a70d8d976" (UID: "106c9067-3fff-47b4-a27f-5d2a70d8d976"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.536265 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106c9067-3fff-47b4-a27f-5d2a70d8d976-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.539319 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106c9067-3fff-47b4-a27f-5d2a70d8d976-kube-api-access-zt4ss" (OuterVolumeSpecName: "kube-api-access-zt4ss") pod "106c9067-3fff-47b4-a27f-5d2a70d8d976" (UID: "106c9067-3fff-47b4-a27f-5d2a70d8d976"). InnerVolumeSpecName "kube-api-access-zt4ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.544606 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fwhrq" Feb 16 21:57:42 crc kubenswrapper[4777]: I0216 21:57:42.639772 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt4ss\" (UniqueName: \"kubernetes.io/projected/106c9067-3fff-47b4-a27f-5d2a70d8d976-kube-api-access-zt4ss\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:43 crc kubenswrapper[4777]: I0216 21:57:43.000369 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cqzjw" event={"ID":"106c9067-3fff-47b4-a27f-5d2a70d8d976","Type":"ContainerDied","Data":"201e9782b7abdc4765379f37f7c07c9788fcd7e4a68652664a345f9d59b4556c"} Feb 16 21:57:43 crc kubenswrapper[4777]: I0216 21:57:43.000662 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="201e9782b7abdc4765379f37f7c07c9788fcd7e4a68652664a345f9d59b4556c" Feb 16 21:57:43 crc kubenswrapper[4777]: I0216 21:57:43.000613 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cqzjw" Feb 16 21:57:43 crc kubenswrapper[4777]: I0216 21:57:43.328507 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fwhrq"] Feb 16 21:57:43 crc kubenswrapper[4777]: I0216 21:57:43.359925 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.012096 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="prometheus" containerID="cri-o://60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36" gracePeriod=600 Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.012580 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fwhrq" event={"ID":"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c","Type":"ContainerStarted","Data":"b83eb4f8b3c7c00edc9ea4ee2717ca366e91c1b11d3d5025a5393c273b349887"} Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.012897 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="thanos-sidecar" containerID="cri-o://cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118" gracePeriod=600 Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.012947 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="config-reloader" containerID="cri-o://82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194" gracePeriod=600 Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.572977 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.578779 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mvkm\" (UniqueName: \"kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-kube-api-access-9mvkm\") pod \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.578838 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-2\") pod \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.578872 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-1\") pod \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.578897 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-0\") pod \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.579056 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\") pod \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\" (UID: 
\"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.579094 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-tls-assets\") pod \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.579131 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-web-config\") pod \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.579198 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-thanos-prometheus-http-client-file\") pod \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.579289 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config-out\") pod \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.579320 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config\") pod \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\" (UID: \"2a88f6b0-198b-4bb9-a593-607fae0b0a8d\") " Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.580526 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "2a88f6b0-198b-4bb9-a593-607fae0b0a8d" (UID: "2a88f6b0-198b-4bb9-a593-607fae0b0a8d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.580585 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "2a88f6b0-198b-4bb9-a593-607fae0b0a8d" (UID: "2a88f6b0-198b-4bb9-a593-607fae0b0a8d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.580856 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "2a88f6b0-198b-4bb9-a593-607fae0b0a8d" (UID: "2a88f6b0-198b-4bb9-a593-607fae0b0a8d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.585515 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config-out" (OuterVolumeSpecName: "config-out") pod "2a88f6b0-198b-4bb9-a593-607fae0b0a8d" (UID: "2a88f6b0-198b-4bb9-a593-607fae0b0a8d"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.585647 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2a88f6b0-198b-4bb9-a593-607fae0b0a8d" (UID: "2a88f6b0-198b-4bb9-a593-607fae0b0a8d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.585748 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-kube-api-access-9mvkm" (OuterVolumeSpecName: "kube-api-access-9mvkm") pod "2a88f6b0-198b-4bb9-a593-607fae0b0a8d" (UID: "2a88f6b0-198b-4bb9-a593-607fae0b0a8d"). InnerVolumeSpecName "kube-api-access-9mvkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.585897 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "2a88f6b0-198b-4bb9-a593-607fae0b0a8d" (UID: "2a88f6b0-198b-4bb9-a593-607fae0b0a8d"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.586111 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config" (OuterVolumeSpecName: "config") pod "2a88f6b0-198b-4bb9-a593-607fae0b0a8d" (UID: "2a88f6b0-198b-4bb9-a593-607fae0b0a8d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.651539 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-web-config" (OuterVolumeSpecName: "web-config") pod "2a88f6b0-198b-4bb9-a593-607fae0b0a8d" (UID: "2a88f6b0-198b-4bb9-a593-607fae0b0a8d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.669372 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "2a88f6b0-198b-4bb9-a593-607fae0b0a8d" (UID: "2a88f6b0-198b-4bb9-a593-607fae0b0a8d"). InnerVolumeSpecName "pvc-42969b9e-7935-49d1-9a41-f57568ab28e3". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.680743 4777 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.680771 4777 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config-out\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.680782 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.680793 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mvkm\" (UniqueName: 
\"kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-kube-api-access-9mvkm\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.680803 4777 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.680813 4777 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.680823 4777 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.680856 4777 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\") on node \"crc\" " Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.680866 4777 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.680875 4777 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2a88f6b0-198b-4bb9-a593-607fae0b0a8d-web-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.714868 4777 csi_attacher.go:630] 
kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.715089 4777 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-42969b9e-7935-49d1-9a41-f57568ab28e3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3") on node "crc" Feb 16 21:57:44 crc kubenswrapper[4777]: I0216 21:57:44.793263 4777 reconciler_common.go:293] "Volume detached for volume \"pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\") on node \"crc\" DevicePath \"\"" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.016331 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cqzjw"] Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.025042 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cqzjw"] Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.030348 4777 generic.go:334] "Generic (PLEG): container finished" podID="f855cb15-8085-43b1-825e-e6316c580924" containerID="fdc613986a99ffe553201605a55ba8f39041b662586f77f0e728fd305f8a45b6" exitCode=0 Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.030429 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjx9d" event={"ID":"f855cb15-8085-43b1-825e-e6316c580924","Type":"ContainerDied","Data":"fdc613986a99ffe553201605a55ba8f39041b662586f77f0e728fd305f8a45b6"} Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.040626 4777 generic.go:334] "Generic (PLEG): container finished" podID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerID="cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118" exitCode=0 Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.040657 4777 generic.go:334] "Generic (PLEG): container finished" 
podID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerID="82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194" exitCode=0 Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.040664 4777 generic.go:334] "Generic (PLEG): container finished" podID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerID="60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36" exitCode=0 Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.040686 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a88f6b0-198b-4bb9-a593-607fae0b0a8d","Type":"ContainerDied","Data":"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118"} Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.040724 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a88f6b0-198b-4bb9-a593-607fae0b0a8d","Type":"ContainerDied","Data":"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194"} Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.040735 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a88f6b0-198b-4bb9-a593-607fae0b0a8d","Type":"ContainerDied","Data":"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36"} Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.040744 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2a88f6b0-198b-4bb9-a593-607fae0b0a8d","Type":"ContainerDied","Data":"0dec1232a0f60ed5380da83440d8733f4662b9c87536123ce65092d54ebd85c6"} Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.040759 4777 scope.go:117] "RemoveContainer" containerID="cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.040907 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.080393 4777 scope.go:117] "RemoveContainer" containerID="82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.081349 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.103565 4777 scope.go:117] "RemoveContainer" containerID="60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.112894 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.139291 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 16 21:57:45 crc kubenswrapper[4777]: E0216 21:57:45.139872 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="thanos-sidecar" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.139913 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="thanos-sidecar" Feb 16 21:57:45 crc kubenswrapper[4777]: E0216 21:57:45.139927 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="config-reloader" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.139933 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="config-reloader" Feb 16 21:57:45 crc kubenswrapper[4777]: E0216 21:57:45.139948 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="init-config-reloader" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.139955 4777 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="init-config-reloader" Feb 16 21:57:45 crc kubenswrapper[4777]: E0216 21:57:45.139999 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="prometheus" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.140006 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="prometheus" Feb 16 21:57:45 crc kubenswrapper[4777]: E0216 21:57:45.140013 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106c9067-3fff-47b4-a27f-5d2a70d8d976" containerName="mariadb-account-create-update" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.140019 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="106c9067-3fff-47b4-a27f-5d2a70d8d976" containerName="mariadb-account-create-update" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.140234 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="config-reloader" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.140251 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="prometheus" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.140259 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" containerName="thanos-sidecar" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.140276 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="106c9067-3fff-47b4-a27f-5d2a70d8d976" containerName="mariadb-account-create-update" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.142256 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.148894 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.149063 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.149177 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.149236 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.150340 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.150528 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.150651 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-nxvh4" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.151460 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.152318 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.157663 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.158433 4777 scope.go:117] "RemoveContainer" 
containerID="5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.196320 4777 scope.go:117] "RemoveContainer" containerID="cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118" Feb 16 21:57:45 crc kubenswrapper[4777]: E0216 21:57:45.196732 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118\": container with ID starting with cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118 not found: ID does not exist" containerID="cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.196765 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118"} err="failed to get container status \"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118\": rpc error: code = NotFound desc = could not find container \"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118\": container with ID starting with cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118 not found: ID does not exist" Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.196794 4777 scope.go:117] "RemoveContainer" containerID="82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194" Feb 16 21:57:45 crc kubenswrapper[4777]: E0216 21:57:45.196993 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194\": container with ID starting with 82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194 not found: ID does not exist" containerID="82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194" Feb 16 21:57:45 crc 
kubenswrapper[4777]: I0216 21:57:45.197012 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194"} err="failed to get container status \"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194\": rpc error: code = NotFound desc = could not find container \"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194\": container with ID starting with 82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194 not found: ID does not exist"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.197025 4777 scope.go:117] "RemoveContainer" containerID="60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36"
Feb 16 21:57:45 crc kubenswrapper[4777]: E0216 21:57:45.199702 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36\": container with ID starting with 60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36 not found: ID does not exist" containerID="60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.199743 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36"} err="failed to get container status \"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36\": rpc error: code = NotFound desc = could not find container \"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36\": container with ID starting with 60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36 not found: ID does not exist"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.199758 4777 scope.go:117] "RemoveContainer" containerID="5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033"
Feb 16 21:57:45 crc kubenswrapper[4777]: E0216 21:57:45.200068 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033\": container with ID starting with 5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033 not found: ID does not exist" containerID="5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.200084 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033"} err="failed to get container status \"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033\": rpc error: code = NotFound desc = could not find container \"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033\": container with ID starting with 5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033 not found: ID does not exist"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.200098 4777 scope.go:117] "RemoveContainer" containerID="cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.200281 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118"} err="failed to get container status \"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118\": rpc error: code = NotFound desc = could not find container \"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118\": container with ID starting with cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118 not found: ID does not exist"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.200302 4777 scope.go:117] "RemoveContainer" containerID="82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.200910 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194"} err="failed to get container status \"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194\": rpc error: code = NotFound desc = could not find container \"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194\": container with ID starting with 82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194 not found: ID does not exist"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.200932 4777 scope.go:117] "RemoveContainer" containerID="60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.202156 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03c6545a-838a-444c-8833-871730be59a7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.202290 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/03c6545a-838a-444c-8833-871730be59a7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.202381 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/03c6545a-838a-444c-8833-871730be59a7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.202487 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.202638 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.202770 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03c6545a-838a-444c-8833-871730be59a7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.202866 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-config\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.202958 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03c6545a-838a-444c-8833-871730be59a7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.203069 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.203153 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.203228 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.203322 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjjwd\" (UniqueName: \"kubernetes.io/projected/03c6545a-838a-444c-8833-871730be59a7-kube-api-access-hjjwd\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.203414 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.203785 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36"} err="failed to get container status \"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36\": rpc error: code = NotFound desc = could not find container \"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36\": container with ID starting with 60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36 not found: ID does not exist"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.203878 4777 scope.go:117] "RemoveContainer" containerID="5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.205517 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033"} err="failed to get container status \"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033\": rpc error: code = NotFound desc = could not find container \"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033\": container with ID starting with 5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033 not found: ID does not exist"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.205558 4777 scope.go:117] "RemoveContainer" containerID="cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.208637 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118"} err="failed to get container status \"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118\": rpc error: code = NotFound desc = could not find container \"cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118\": container with ID starting with cad104d14ee14deaf9b17fd02f890e557eb1f25139f225b423e3cd1000448118 not found: ID does not exist"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.208670 4777 scope.go:117] "RemoveContainer" containerID="82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.208969 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194"} err="failed to get container status \"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194\": rpc error: code = NotFound desc = could not find container \"82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194\": container with ID starting with 82106fce501ececffac34da2026029bfbbb9b120607f8aeaae8ae83a15db7194 not found: ID does not exist"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.208993 4777 scope.go:117] "RemoveContainer" containerID="60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.209800 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36"} err="failed to get container status \"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36\": rpc error: code = NotFound desc = could not find container \"60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36\": container with ID starting with 60a281e1bb3385305f6d599dc9478506a148d3c5777f546fffb1b3f68a09ec36 not found: ID does not exist"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.209818 4777 scope.go:117] "RemoveContainer" containerID="5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.210018 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033"} err="failed to get container status \"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033\": rpc error: code = NotFound desc = could not find container \"5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033\": container with ID starting with 5d4d69c6e61293c1743549a5829e557021404c9435eb3c3aa5318457ba0b8033 not found: ID does not exist"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.248299 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-jzwq7"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305412 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305500 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305541 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03c6545a-838a-444c-8833-871730be59a7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305570 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-config\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305634 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03c6545a-838a-444c-8833-871730be59a7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305658 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305698 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305748 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305805 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjjwd\" (UniqueName: \"kubernetes.io/projected/03c6545a-838a-444c-8833-871730be59a7-kube-api-access-hjjwd\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305851 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305937 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03c6545a-838a-444c-8833-871730be59a7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.305969 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/03c6545a-838a-444c-8833-871730be59a7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.306008 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/03c6545a-838a-444c-8833-871730be59a7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.306353 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03c6545a-838a-444c-8833-871730be59a7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.307335 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/03c6545a-838a-444c-8833-871730be59a7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.307440 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/03c6545a-838a-444c-8833-871730be59a7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.309688 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.309791 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d7e1ae79a57ec286f5c0df29629ccbb0e6df636c5e8b94fad20c5c32a47e117a/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.311343 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-config\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.312320 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.321961 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.322484 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03c6545a-838a-444c-8833-871730be59a7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.322636 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.322674 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03c6545a-838a-444c-8833-871730be59a7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.322675 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.325835 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03c6545a-838a-444c-8833-871730be59a7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.334359 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjjwd\" (UniqueName: \"kubernetes.io/projected/03c6545a-838a-444c-8833-871730be59a7-kube-api-access-hjjwd\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.355803 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-42969b9e-7935-49d1-9a41-f57568ab28e3\") pod \"prometheus-metric-storage-0\" (UID: \"03c6545a-838a-444c-8833-871730be59a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.438282 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-7jdcb"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.467487 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.573939 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8"
Feb 16 21:57:45 crc kubenswrapper[4777]: I0216 21:57:45.948454 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 16 21:57:45 crc kubenswrapper[4777]: W0216 21:57:45.951920 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03c6545a_838a_444c_8833_871730be59a7.slice/crio-1c265b9995793f3ff8b233f659a173dd11a7d654d7e0c37550e985ceb0cd0e35 WatchSource:0}: Error finding container 1c265b9995793f3ff8b233f659a173dd11a7d654d7e0c37550e985ceb0cd0e35: Status 404 returned error can't find the container with id 1c265b9995793f3ff8b233f659a173dd11a7d654d7e0c37550e985ceb0cd0e35
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.051849 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03c6545a-838a-444c-8833-871730be59a7","Type":"ContainerStarted","Data":"1c265b9995793f3ff8b233f659a173dd11a7d654d7e0c37550e985ceb0cd0e35"}
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.191655 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106c9067-3fff-47b4-a27f-5d2a70d8d976" path="/var/lib/kubelet/pods/106c9067-3fff-47b4-a27f-5d2a70d8d976/volumes"
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.192486 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a88f6b0-198b-4bb9-a593-607fae0b0a8d" path="/var/lib/kubelet/pods/2a88f6b0-198b-4bb9-a593-607fae0b0a8d/volumes"
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.338245 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjx9d"
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.395117 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a882b0c3-7f2e-446b-aea4-476cacffb112" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.428179 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-combined-ca-bundle\") pod \"f855cb15-8085-43b1-825e-e6316c580924\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") "
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.428253 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f855cb15-8085-43b1-825e-e6316c580924-etc-swift\") pod \"f855cb15-8085-43b1-825e-e6316c580924\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") "
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.428293 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-scripts\") pod \"f855cb15-8085-43b1-825e-e6316c580924\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") "
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.428322 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-ring-data-devices\") pod \"f855cb15-8085-43b1-825e-e6316c580924\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") "
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.428348 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-swiftconf\") pod \"f855cb15-8085-43b1-825e-e6316c580924\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") "
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.428444 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-dispersionconf\") pod \"f855cb15-8085-43b1-825e-e6316c580924\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") "
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.428472 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn89z\" (UniqueName: \"kubernetes.io/projected/f855cb15-8085-43b1-825e-e6316c580924-kube-api-access-rn89z\") pod \"f855cb15-8085-43b1-825e-e6316c580924\" (UID: \"f855cb15-8085-43b1-825e-e6316c580924\") "
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.429479 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f855cb15-8085-43b1-825e-e6316c580924" (UID: "f855cb15-8085-43b1-825e-e6316c580924"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.429949 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f855cb15-8085-43b1-825e-e6316c580924-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f855cb15-8085-43b1-825e-e6316c580924" (UID: "f855cb15-8085-43b1-825e-e6316c580924"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.433147 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f855cb15-8085-43b1-825e-e6316c580924-kube-api-access-rn89z" (OuterVolumeSpecName: "kube-api-access-rn89z") pod "f855cb15-8085-43b1-825e-e6316c580924" (UID: "f855cb15-8085-43b1-825e-e6316c580924"). InnerVolumeSpecName "kube-api-access-rn89z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.437954 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f855cb15-8085-43b1-825e-e6316c580924" (UID: "f855cb15-8085-43b1-825e-e6316c580924"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.450951 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f855cb15-8085-43b1-825e-e6316c580924" (UID: "f855cb15-8085-43b1-825e-e6316c580924"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.455082 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f855cb15-8085-43b1-825e-e6316c580924" (UID: "f855cb15-8085-43b1-825e-e6316c580924"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.457707 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-scripts" (OuterVolumeSpecName: "scripts") pod "f855cb15-8085-43b1-825e-e6316c580924" (UID: "f855cb15-8085-43b1-825e-e6316c580924"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.461591 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.530661 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.530707 4777 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f855cb15-8085-43b1-825e-e6316c580924-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.530739 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-scripts\") on node \"crc\" DevicePath \"\""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.530751 4777 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f855cb15-8085-43b1-825e-e6316c580924-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.530764 4777 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.530775 4777 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f855cb15-8085-43b1-825e-e6316c580924-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.530789 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn89z\" (UniqueName: \"kubernetes.io/projected/f855cb15-8085-43b1-825e-e6316c580924-kube-api-access-rn89z\") on node \"crc\" DevicePath \"\""
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.570533 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 16 21:57:46 crc kubenswrapper[4777]: I0216 21:57:46.646277 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 16 21:57:47 crc kubenswrapper[4777]: I0216 21:57:47.063672 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rjx9d" event={"ID":"f855cb15-8085-43b1-825e-e6316c580924","Type":"ContainerDied","Data":"cdf8023f37f565ae1d838a955120c01857b8612c9f4a420f6f08500386b47dec"}
Feb 16 21:57:47 crc kubenswrapper[4777]: I0216 21:57:47.063751 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdf8023f37f565ae1d838a955120c01857b8612c9f4a420f6f08500386b47dec"
Feb 16 21:57:47 crc kubenswrapper[4777]: I0216 21:57:47.063818 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjx9d"
Feb 16 21:57:48 crc kubenswrapper[4777]: I0216 21:57:48.038295 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-q5qdr" podUID="7f871748-de9b-43dc-87df-728b12402b6b" containerName="ovn-controller" probeResult="failure" output=<
Feb 16 21:57:48 crc kubenswrapper[4777]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 16 21:57:48 crc kubenswrapper[4777]: >
Feb 16 21:57:48 crc kubenswrapper[4777]: I0216 21:57:48.096180 4777 generic.go:334] "Generic (PLEG): container finished" podID="4df600f3-97e1-4ac5-980b-2c42aecc5e81" containerID="7f88e90d4a34122af31a7965e83e3daa52c8942cc4703caf9b8d53ede467bef3" exitCode=0
Feb 16 21:57:48 crc kubenswrapper[4777]: I0216 21:57:48.096231 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4df600f3-97e1-4ac5-980b-2c42aecc5e81","Type":"ContainerDied","Data":"7f88e90d4a34122af31a7965e83e3daa52c8942cc4703caf9b8d53ede467bef3"}
Feb 16 21:57:48 crc kubenswrapper[4777]: E0216 21:57:48.413427 4777 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9346e4ab_e1d7_42a9_8817_850a2f84e57d.slice/crio-7e00e91aac82460ebf3e6980e1ea18e4e936c85a9c532c3705eb436689217fbd.scope\": RecentStats: unable to find data in memory cache]"
Feb 16 21:57:49 crc kubenswrapper[4777]: I0216 21:57:49.107429 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03c6545a-838a-444c-8833-871730be59a7","Type":"ContainerStarted","Data":"a48e5854440beebf1698d60dc862f5c46263502aef1c9c4c2415614043c1f1b6"}
Feb 16 21:57:49 crc kubenswrapper[4777]: I0216 21:57:49.110368 4777 generic.go:334] "Generic (PLEG): container finished" podID="9346e4ab-e1d7-42a9-8817-850a2f84e57d"
containerID="7e00e91aac82460ebf3e6980e1ea18e4e936c85a9c532c3705eb436689217fbd" exitCode=0 Feb 16 21:57:49 crc kubenswrapper[4777]: I0216 21:57:49.110431 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9346e4ab-e1d7-42a9-8817-850a2f84e57d","Type":"ContainerDied","Data":"7e00e91aac82460ebf3e6980e1ea18e4e936c85a9c532c3705eb436689217fbd"} Feb 16 21:57:49 crc kubenswrapper[4777]: I0216 21:57:49.120235 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4df600f3-97e1-4ac5-980b-2c42aecc5e81","Type":"ContainerStarted","Data":"0b580f7170f134b677382c4a51d573f1bca0346b146bddafe41a9b411322962c"} Feb 16 21:57:49 crc kubenswrapper[4777]: I0216 21:57:49.120826 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 16 21:57:49 crc kubenswrapper[4777]: I0216 21:57:49.158045 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.67706375 podStartE2EDuration="1m3.158013705s" podCreationTimestamp="2026-02-16 21:56:46 +0000 UTC" firstStartedPulling="2026-02-16 21:57:03.257800683 +0000 UTC m=+1143.840301785" lastFinishedPulling="2026-02-16 21:57:12.738750638 +0000 UTC m=+1153.321251740" observedRunningTime="2026-02-16 21:57:49.154299241 +0000 UTC m=+1189.736800343" watchObservedRunningTime="2026-02-16 21:57:49.158013705 +0000 UTC m=+1189.740514807" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.025911 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2xkxn"] Feb 16 21:57:50 crc kubenswrapper[4777]: E0216 21:57:50.026735 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f855cb15-8085-43b1-825e-e6316c580924" containerName="swift-ring-rebalance" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.026822 4777 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f855cb15-8085-43b1-825e-e6316c580924" containerName="swift-ring-rebalance" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.027041 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f855cb15-8085-43b1-825e-e6316c580924" containerName="swift-ring-rebalance" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.027836 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2xkxn" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.041437 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2xkxn"] Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.063821 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.130589 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9346e4ab-e1d7-42a9-8817-850a2f84e57d","Type":"ContainerStarted","Data":"c082f5166f1f32334e9cc42f48f47d2bad03f3e4ae0ca5e605adbae1c399cbe9"} Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.132457 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.140832 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-operator-scripts\") pod \"root-account-create-update-2xkxn\" (UID: \"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5\") " pod="openstack/root-account-create-update-2xkxn" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.141081 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnhf4\" (UniqueName: 
\"kubernetes.io/projected/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-kube-api-access-jnhf4\") pod \"root-account-create-update-2xkxn\" (UID: \"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5\") " pod="openstack/root-account-create-update-2xkxn" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.158290 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.804007015 podStartE2EDuration="1m3.158274449s" podCreationTimestamp="2026-02-16 21:56:47 +0000 UTC" firstStartedPulling="2026-02-16 21:57:03.169568722 +0000 UTC m=+1143.752069824" lastFinishedPulling="2026-02-16 21:57:13.523836146 +0000 UTC m=+1154.106337258" observedRunningTime="2026-02-16 21:57:50.152020384 +0000 UTC m=+1190.734521486" watchObservedRunningTime="2026-02-16 21:57:50.158274449 +0000 UTC m=+1190.740775551" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.242538 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-operator-scripts\") pod \"root-account-create-update-2xkxn\" (UID: \"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5\") " pod="openstack/root-account-create-update-2xkxn" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.243040 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnhf4\" (UniqueName: \"kubernetes.io/projected/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-kube-api-access-jnhf4\") pod \"root-account-create-update-2xkxn\" (UID: \"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5\") " pod="openstack/root-account-create-update-2xkxn" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.243352 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-operator-scripts\") pod \"root-account-create-update-2xkxn\" (UID: 
\"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5\") " pod="openstack/root-account-create-update-2xkxn" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.263400 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnhf4\" (UniqueName: \"kubernetes.io/projected/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-kube-api-access-jnhf4\") pod \"root-account-create-update-2xkxn\" (UID: \"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5\") " pod="openstack/root-account-create-update-2xkxn" Feb 16 21:57:50 crc kubenswrapper[4777]: I0216 21:57:50.385617 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2xkxn" Feb 16 21:57:52 crc kubenswrapper[4777]: I0216 21:57:52.925308 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-q5qdr" podUID="7f871748-de9b-43dc-87df-728b12402b6b" containerName="ovn-controller" probeResult="failure" output=< Feb 16 21:57:52 crc kubenswrapper[4777]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 16 21:57:52 crc kubenswrapper[4777]: > Feb 16 21:57:52 crc kubenswrapper[4777]: I0216 21:57:52.947852 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:57:52 crc kubenswrapper[4777]: I0216 21:57:52.977258 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7wbfs" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.177236 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-q5qdr-config-8crl9"] Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.179861 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.193273 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.193915 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5qdr-config-8crl9"] Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.304554 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-log-ovn\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.304610 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-scripts\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.304671 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run-ovn\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.304703 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") 
" pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.304864 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6v2n\" (UniqueName: \"kubernetes.io/projected/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-kube-api-access-c6v2n\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.304904 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-additional-scripts\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.407781 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-log-ovn\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.407821 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-scripts\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.407907 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run-ovn\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: 
\"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.407944 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.407977 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6v2n\" (UniqueName: \"kubernetes.io/projected/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-kube-api-access-c6v2n\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.408000 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-additional-scripts\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.408889 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-additional-scripts\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.409138 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-log-ovn\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: 
\"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.411536 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.411598 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run-ovn\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.412312 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-scripts\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.432781 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6v2n\" (UniqueName: \"kubernetes.io/projected/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-kube-api-access-c6v2n\") pod \"ovn-controller-q5qdr-config-8crl9\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") " pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:53 crc kubenswrapper[4777]: I0216 21:57:53.498962 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-q5qdr-config-8crl9" Feb 16 21:57:55 crc kubenswrapper[4777]: I0216 21:57:55.190681 4777 generic.go:334] "Generic (PLEG): container finished" podID="03c6545a-838a-444c-8833-871730be59a7" containerID="a48e5854440beebf1698d60dc862f5c46263502aef1c9c4c2415614043c1f1b6" exitCode=0 Feb 16 21:57:55 crc kubenswrapper[4777]: I0216 21:57:55.190742 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03c6545a-838a-444c-8833-871730be59a7","Type":"ContainerDied","Data":"a48e5854440beebf1698d60dc862f5c46263502aef1c9c4c2415614043c1f1b6"} Feb 16 21:57:56 crc kubenswrapper[4777]: I0216 21:57:56.396406 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a882b0c3-7f2e-446b-aea4-476cacffb112" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 16 21:57:56 crc kubenswrapper[4777]: I0216 21:57:56.871594 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:56 crc kubenswrapper[4777]: I0216 21:57:56.875601 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3392d073-5de5-4f7e-ae87-e892f769157a-etc-swift\") pod \"swift-storage-0\" (UID: \"3392d073-5de5-4f7e-ae87-e892f769157a\") " pod="openstack/swift-storage-0" Feb 16 21:57:56 crc kubenswrapper[4777]: I0216 21:57:56.970595 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2xkxn"] Feb 16 21:57:56 crc kubenswrapper[4777]: W0216 21:57:56.976456 4777 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbae3fdc_fa7e_41bc_8c73_8d126b476ba5.slice/crio-1b2dcbb25624b0e475a6e7c832d0180bf63e6a9f18bf76bb0184778d1e4adde9 WatchSource:0}: Error finding container 1b2dcbb25624b0e475a6e7c832d0180bf63e6a9f18bf76bb0184778d1e4adde9: Status 404 returned error can't find the container with id 1b2dcbb25624b0e475a6e7c832d0180bf63e6a9f18bf76bb0184778d1e4adde9 Feb 16 21:57:57 crc kubenswrapper[4777]: I0216 21:57:57.036472 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 16 21:57:57 crc kubenswrapper[4777]: I0216 21:57:57.069914 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5qdr-config-8crl9"] Feb 16 21:57:57 crc kubenswrapper[4777]: W0216 21:57:57.076951 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8879ecb0_53c9_4a75_993c_10f5b0c6cbba.slice/crio-a73b78565c2ce60a27cc557aaa49f383fe36f2fa31ced84d6c026692ef06e7c1 WatchSource:0}: Error finding container a73b78565c2ce60a27cc557aaa49f383fe36f2fa31ced84d6c026692ef06e7c1: Status 404 returned error can't find the container with id a73b78565c2ce60a27cc557aaa49f383fe36f2fa31ced84d6c026692ef06e7c1 Feb 16 21:57:57 crc kubenswrapper[4777]: I0216 21:57:57.210898 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2xkxn" event={"ID":"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5","Type":"ContainerStarted","Data":"517f476878727fab47fc8e4916d29876152c9fc73b1534fed166306ffeff1cc1"} Feb 16 21:57:57 crc kubenswrapper[4777]: I0216 21:57:57.210939 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2xkxn" event={"ID":"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5","Type":"ContainerStarted","Data":"1b2dcbb25624b0e475a6e7c832d0180bf63e6a9f18bf76bb0184778d1e4adde9"} Feb 16 21:57:57 crc kubenswrapper[4777]: I0216 21:57:57.214101 4777 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5qdr-config-8crl9" event={"ID":"8879ecb0-53c9-4a75-993c-10f5b0c6cbba","Type":"ContainerStarted","Data":"a73b78565c2ce60a27cc557aaa49f383fe36f2fa31ced84d6c026692ef06e7c1"} Feb 16 21:57:57 crc kubenswrapper[4777]: I0216 21:57:57.216635 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03c6545a-838a-444c-8833-871730be59a7","Type":"ContainerStarted","Data":"e43ec331fe2537d9539d25d86c828bfe923b172156376f9044fbba7e9f523809"} Feb 16 21:57:57 crc kubenswrapper[4777]: I0216 21:57:57.722300 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-2xkxn" podStartSLOduration=7.722273985 podStartE2EDuration="7.722273985s" podCreationTimestamp="2026-02-16 21:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:57:57.233467044 +0000 UTC m=+1197.815968146" watchObservedRunningTime="2026-02-16 21:57:57.722273985 +0000 UTC m=+1198.304775127" Feb 16 21:57:57 crc kubenswrapper[4777]: I0216 21:57:57.735216 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 16 21:57:57 crc kubenswrapper[4777]: W0216 21:57:57.778584 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3392d073_5de5_4f7e_ae87_e892f769157a.slice/crio-741eb290ff2b1ce110defdce232cc086281a3de919cb8d5cb70c317d9f39a44d WatchSource:0}: Error finding container 741eb290ff2b1ce110defdce232cc086281a3de919cb8d5cb70c317d9f39a44d: Status 404 returned error can't find the container with id 741eb290ff2b1ce110defdce232cc086281a3de919cb8d5cb70c317d9f39a44d Feb 16 21:57:57 crc kubenswrapper[4777]: I0216 21:57:57.932033 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-controller-q5qdr" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.242801 4777 generic.go:334] "Generic (PLEG): container finished" podID="bbae3fdc-fa7e-41bc-8c73-8d126b476ba5" containerID="517f476878727fab47fc8e4916d29876152c9fc73b1534fed166306ffeff1cc1" exitCode=0 Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.242852 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2xkxn" event={"ID":"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5","Type":"ContainerDied","Data":"517f476878727fab47fc8e4916d29876152c9fc73b1534fed166306ffeff1cc1"} Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.246528 4777 generic.go:334] "Generic (PLEG): container finished" podID="8879ecb0-53c9-4a75-993c-10f5b0c6cbba" containerID="4630e87ac1c542d44341d25d6197690ce1103bedcb540f3ca3b1cf8df95d5ed8" exitCode=0 Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.246610 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5qdr-config-8crl9" event={"ID":"8879ecb0-53c9-4a75-993c-10f5b0c6cbba","Type":"ContainerDied","Data":"4630e87ac1c542d44341d25d6197690ce1103bedcb540f3ca3b1cf8df95d5ed8"} Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.251407 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"741eb290ff2b1ce110defdce232cc086281a3de919cb8d5cb70c317d9f39a44d"} Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.255392 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fwhrq" event={"ID":"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c","Type":"ContainerStarted","Data":"8c3c4e6df7d757d6543c535e83a20260f43671d7f30177c3e175f3c230eca68a"} Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.295005 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-fwhrq" podStartSLOduration=3.075790674 
podStartE2EDuration="16.294987085s" podCreationTimestamp="2026-02-16 21:57:42 +0000 UTC" firstStartedPulling="2026-02-16 21:57:43.324001542 +0000 UTC m=+1183.906502644" lastFinishedPulling="2026-02-16 21:57:56.543197943 +0000 UTC m=+1197.125699055" observedRunningTime="2026-02-16 21:57:58.282604138 +0000 UTC m=+1198.865105320" watchObservedRunningTime="2026-02-16 21:57:58.294987085 +0000 UTC m=+1198.877488187" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.346066 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.723378 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-q9pnh"] Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.728763 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-q9pnh" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.750681 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-q9pnh"] Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.806666 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61160d7e-1d8e-47fa-94c6-02c654559bca-operator-scripts\") pod \"cinder-db-create-q9pnh\" (UID: \"61160d7e-1d8e-47fa-94c6-02c654559bca\") " pod="openstack/cinder-db-create-q9pnh" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.806758 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9z7z\" (UniqueName: \"kubernetes.io/projected/61160d7e-1d8e-47fa-94c6-02c654559bca-kube-api-access-d9z7z\") pod \"cinder-db-create-q9pnh\" (UID: \"61160d7e-1d8e-47fa-94c6-02c654559bca\") " pod="openstack/cinder-db-create-q9pnh" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.845146 4777 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-6597-account-create-update-8t7kc"] Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.846373 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6597-account-create-update-8t7kc" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.852354 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.855492 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6597-account-create-update-8t7kc"] Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.908598 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p2bv\" (UniqueName: \"kubernetes.io/projected/e859b095-dbc8-4bea-8d75-e576976dffaa-kube-api-access-5p2bv\") pod \"cinder-6597-account-create-update-8t7kc\" (UID: \"e859b095-dbc8-4bea-8d75-e576976dffaa\") " pod="openstack/cinder-6597-account-create-update-8t7kc" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.908881 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61160d7e-1d8e-47fa-94c6-02c654559bca-operator-scripts\") pod \"cinder-db-create-q9pnh\" (UID: \"61160d7e-1d8e-47fa-94c6-02c654559bca\") " pod="openstack/cinder-db-create-q9pnh" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.908993 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9z7z\" (UniqueName: \"kubernetes.io/projected/61160d7e-1d8e-47fa-94c6-02c654559bca-kube-api-access-d9z7z\") pod \"cinder-db-create-q9pnh\" (UID: \"61160d7e-1d8e-47fa-94c6-02c654559bca\") " pod="openstack/cinder-db-create-q9pnh" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.909064 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e859b095-dbc8-4bea-8d75-e576976dffaa-operator-scripts\") pod \"cinder-6597-account-create-update-8t7kc\" (UID: \"e859b095-dbc8-4bea-8d75-e576976dffaa\") " pod="openstack/cinder-6597-account-create-update-8t7kc" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.910106 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61160d7e-1d8e-47fa-94c6-02c654559bca-operator-scripts\") pod \"cinder-db-create-q9pnh\" (UID: \"61160d7e-1d8e-47fa-94c6-02c654559bca\") " pod="openstack/cinder-db-create-q9pnh" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.926077 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-dqt5c"] Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.927360 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-dqt5c" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.933155 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9z7z\" (UniqueName: \"kubernetes.io/projected/61160d7e-1d8e-47fa-94c6-02c654559bca-kube-api-access-d9z7z\") pod \"cinder-db-create-q9pnh\" (UID: \"61160d7e-1d8e-47fa-94c6-02c654559bca\") " pod="openstack/cinder-db-create-q9pnh" Feb 16 21:57:58 crc kubenswrapper[4777]: I0216 21:57:58.943682 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-dqt5c"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.012680 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p2bv\" (UniqueName: \"kubernetes.io/projected/e859b095-dbc8-4bea-8d75-e576976dffaa-kube-api-access-5p2bv\") pod \"cinder-6597-account-create-update-8t7kc\" (UID: \"e859b095-dbc8-4bea-8d75-e576976dffaa\") " pod="openstack/cinder-6597-account-create-update-8t7kc" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.013294 
4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5063ed0-49f5-4947-af98-867246696986-operator-scripts\") pod \"cloudkitty-db-create-dqt5c\" (UID: \"e5063ed0-49f5-4947-af98-867246696986\") " pod="openstack/cloudkitty-db-create-dqt5c" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.013390 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e859b095-dbc8-4bea-8d75-e576976dffaa-operator-scripts\") pod \"cinder-6597-account-create-update-8t7kc\" (UID: \"e859b095-dbc8-4bea-8d75-e576976dffaa\") " pod="openstack/cinder-6597-account-create-update-8t7kc" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.013474 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h92rc\" (UniqueName: \"kubernetes.io/projected/e5063ed0-49f5-4947-af98-867246696986-kube-api-access-h92rc\") pod \"cloudkitty-db-create-dqt5c\" (UID: \"e5063ed0-49f5-4947-af98-867246696986\") " pod="openstack/cloudkitty-db-create-dqt5c" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.021226 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e859b095-dbc8-4bea-8d75-e576976dffaa-operator-scripts\") pod \"cinder-6597-account-create-update-8t7kc\" (UID: \"e859b095-dbc8-4bea-8d75-e576976dffaa\") " pod="openstack/cinder-6597-account-create-update-8t7kc" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.033314 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wxbd4"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.062336 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-q9pnh" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.063351 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p2bv\" (UniqueName: \"kubernetes.io/projected/e859b095-dbc8-4bea-8d75-e576976dffaa-kube-api-access-5p2bv\") pod \"cinder-6597-account-create-update-8t7kc\" (UID: \"e859b095-dbc8-4bea-8d75-e576976dffaa\") " pod="openstack/cinder-6597-account-create-update-8t7kc" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.091452 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wxbd4"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.091818 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wxbd4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.101467 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4faf-account-create-update-xz29p"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.102952 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4faf-account-create-update-xz29p"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.103023 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4faf-account-create-update-xz29p" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.109853 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.115170 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5063ed0-49f5-4947-af98-867246696986-operator-scripts\") pod \"cloudkitty-db-create-dqt5c\" (UID: \"e5063ed0-49f5-4947-af98-867246696986\") " pod="openstack/cloudkitty-db-create-dqt5c" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.115359 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h92rc\" (UniqueName: \"kubernetes.io/projected/e5063ed0-49f5-4947-af98-867246696986-kube-api-access-h92rc\") pod \"cloudkitty-db-create-dqt5c\" (UID: \"e5063ed0-49f5-4947-af98-867246696986\") " pod="openstack/cloudkitty-db-create-dqt5c" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.116353 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5063ed0-49f5-4947-af98-867246696986-operator-scripts\") pod \"cloudkitty-db-create-dqt5c\" (UID: \"e5063ed0-49f5-4947-af98-867246696986\") " pod="openstack/cloudkitty-db-create-dqt5c" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.144206 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h92rc\" (UniqueName: \"kubernetes.io/projected/e5063ed0-49f5-4947-af98-867246696986-kube-api-access-h92rc\") pod \"cloudkitty-db-create-dqt5c\" (UID: \"e5063ed0-49f5-4947-af98-867246696986\") " pod="openstack/cloudkitty-db-create-dqt5c" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.146368 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-h4sfw"] Feb 16 21:57:59 crc 
kubenswrapper[4777]: I0216 21:57:59.147650 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.152467 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.152537 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.152701 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cgg5d" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.153031 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.155560 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-h4sfw"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.180512 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6597-account-create-update-8t7kc" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.217744 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk8jh\" (UniqueName: \"kubernetes.io/projected/68c19c4b-cd3c-43b7-bb94-dba6086f3980-kube-api-access-jk8jh\") pod \"barbican-db-create-wxbd4\" (UID: \"68c19c4b-cd3c-43b7-bb94-dba6086f3980\") " pod="openstack/barbican-db-create-wxbd4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.217799 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvnd\" (UniqueName: \"kubernetes.io/projected/cf2488e5-f431-4464-b9f9-f549322948d1-kube-api-access-hpvnd\") pod \"barbican-4faf-account-create-update-xz29p\" (UID: \"cf2488e5-f431-4464-b9f9-f549322948d1\") " pod="openstack/barbican-4faf-account-create-update-xz29p" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.217828 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-config-data\") pod \"keystone-db-sync-h4sfw\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.217868 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5b86\" (UniqueName: \"kubernetes.io/projected/5dde5180-41b7-4ccd-b5bf-d144b205d163-kube-api-access-l5b86\") pod \"keystone-db-sync-h4sfw\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.217908 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cf2488e5-f431-4464-b9f9-f549322948d1-operator-scripts\") pod \"barbican-4faf-account-create-update-xz29p\" (UID: \"cf2488e5-f431-4464-b9f9-f549322948d1\") " pod="openstack/barbican-4faf-account-create-update-xz29p" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.217942 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-combined-ca-bundle\") pod \"keystone-db-sync-h4sfw\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.217970 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68c19c4b-cd3c-43b7-bb94-dba6086f3980-operator-scripts\") pod \"barbican-db-create-wxbd4\" (UID: \"68c19c4b-cd3c-43b7-bb94-dba6086f3980\") " pod="openstack/barbican-db-create-wxbd4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.316867 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-c9w56"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.318193 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-dqt5c" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.318312 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-c9w56" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.321632 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf2488e5-f431-4464-b9f9-f549322948d1-operator-scripts\") pod \"barbican-4faf-account-create-update-xz29p\" (UID: \"cf2488e5-f431-4464-b9f9-f549322948d1\") " pod="openstack/barbican-4faf-account-create-update-xz29p" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.321692 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-combined-ca-bundle\") pod \"keystone-db-sync-h4sfw\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.321730 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68c19c4b-cd3c-43b7-bb94-dba6086f3980-operator-scripts\") pod \"barbican-db-create-wxbd4\" (UID: \"68c19c4b-cd3c-43b7-bb94-dba6086f3980\") " pod="openstack/barbican-db-create-wxbd4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.321863 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk8jh\" (UniqueName: \"kubernetes.io/projected/68c19c4b-cd3c-43b7-bb94-dba6086f3980-kube-api-access-jk8jh\") pod \"barbican-db-create-wxbd4\" (UID: \"68c19c4b-cd3c-43b7-bb94-dba6086f3980\") " pod="openstack/barbican-db-create-wxbd4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.321892 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvnd\" (UniqueName: \"kubernetes.io/projected/cf2488e5-f431-4464-b9f9-f549322948d1-kube-api-access-hpvnd\") pod \"barbican-4faf-account-create-update-xz29p\" (UID: 
\"cf2488e5-f431-4464-b9f9-f549322948d1\") " pod="openstack/barbican-4faf-account-create-update-xz29p" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.322579 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf2488e5-f431-4464-b9f9-f549322948d1-operator-scripts\") pod \"barbican-4faf-account-create-update-xz29p\" (UID: \"cf2488e5-f431-4464-b9f9-f549322948d1\") " pod="openstack/barbican-4faf-account-create-update-xz29p" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.322598 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-config-data\") pod \"keystone-db-sync-h4sfw\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.322697 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5b86\" (UniqueName: \"kubernetes.io/projected/5dde5180-41b7-4ccd-b5bf-d144b205d163-kube-api-access-l5b86\") pod \"keystone-db-sync-h4sfw\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.323067 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68c19c4b-cd3c-43b7-bb94-dba6086f3980-operator-scripts\") pod \"barbican-db-create-wxbd4\" (UID: \"68c19c4b-cd3c-43b7-bb94-dba6086f3980\") " pod="openstack/barbican-db-create-wxbd4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.338739 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-combined-ca-bundle\") pod \"keystone-db-sync-h4sfw\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " 
pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.347268 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5b86\" (UniqueName: \"kubernetes.io/projected/5dde5180-41b7-4ccd-b5bf-d144b205d163-kube-api-access-l5b86\") pod \"keystone-db-sync-h4sfw\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.349641 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk8jh\" (UniqueName: \"kubernetes.io/projected/68c19c4b-cd3c-43b7-bb94-dba6086f3980-kube-api-access-jk8jh\") pod \"barbican-db-create-wxbd4\" (UID: \"68c19c4b-cd3c-43b7-bb94-dba6086f3980\") " pod="openstack/barbican-db-create-wxbd4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.351894 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-20bd-account-create-update-pdrs4"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.352763 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvnd\" (UniqueName: \"kubernetes.io/projected/cf2488e5-f431-4464-b9f9-f549322948d1-kube-api-access-hpvnd\") pod \"barbican-4faf-account-create-update-xz29p\" (UID: \"cf2488e5-f431-4464-b9f9-f549322948d1\") " pod="openstack/barbican-4faf-account-create-update-xz29p" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.353193 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-20bd-account-create-update-pdrs4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.356968 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.371453 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-config-data\") pod \"keystone-db-sync-h4sfw\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.382159 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-c9w56"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.405488 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-20bd-account-create-update-pdrs4"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.424275 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-feeb-account-create-update-kf5rr"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.425335 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.426498 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567a40e-3ae3-4d2d-be6c-25d0e1175711-operator-scripts\") pod \"neutron-20bd-account-create-update-pdrs4\" (UID: \"b567a40e-3ae3-4d2d-be6c-25d0e1175711\") " pod="openstack/neutron-20bd-account-create-update-pdrs4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.426553 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng5fn\" (UniqueName: \"kubernetes.io/projected/5dc41bf6-4400-47d8-bada-837e28d0d42c-kube-api-access-ng5fn\") pod \"neutron-db-create-c9w56\" (UID: \"5dc41bf6-4400-47d8-bada-837e28d0d42c\") " pod="openstack/neutron-db-create-c9w56" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.426577 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zbtb\" (UniqueName: \"kubernetes.io/projected/b567a40e-3ae3-4d2d-be6c-25d0e1175711-kube-api-access-5zbtb\") pod \"neutron-20bd-account-create-update-pdrs4\" (UID: \"b567a40e-3ae3-4d2d-be6c-25d0e1175711\") " pod="openstack/neutron-20bd-account-create-update-pdrs4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.426597 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dc41bf6-4400-47d8-bada-837e28d0d42c-operator-scripts\") pod \"neutron-db-create-c9w56\" (UID: \"5dc41bf6-4400-47d8-bada-837e28d0d42c\") " pod="openstack/neutron-db-create-c9w56" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.428032 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 
21:57:59.433650 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-feeb-account-create-update-kf5rr"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.517331 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wxbd4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.527412 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4faf-account-create-update-xz29p" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.528643 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxjlt\" (UniqueName: \"kubernetes.io/projected/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-kube-api-access-nxjlt\") pod \"cloudkitty-feeb-account-create-update-kf5rr\" (UID: \"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f\") " pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.528799 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-operator-scripts\") pod \"cloudkitty-feeb-account-create-update-kf5rr\" (UID: \"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f\") " pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.528875 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567a40e-3ae3-4d2d-be6c-25d0e1175711-operator-scripts\") pod \"neutron-20bd-account-create-update-pdrs4\" (UID: \"b567a40e-3ae3-4d2d-be6c-25d0e1175711\") " pod="openstack/neutron-20bd-account-create-update-pdrs4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.528912 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng5fn\" 
(UniqueName: \"kubernetes.io/projected/5dc41bf6-4400-47d8-bada-837e28d0d42c-kube-api-access-ng5fn\") pod \"neutron-db-create-c9w56\" (UID: \"5dc41bf6-4400-47d8-bada-837e28d0d42c\") " pod="openstack/neutron-db-create-c9w56" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.528938 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zbtb\" (UniqueName: \"kubernetes.io/projected/b567a40e-3ae3-4d2d-be6c-25d0e1175711-kube-api-access-5zbtb\") pod \"neutron-20bd-account-create-update-pdrs4\" (UID: \"b567a40e-3ae3-4d2d-be6c-25d0e1175711\") " pod="openstack/neutron-20bd-account-create-update-pdrs4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.528962 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dc41bf6-4400-47d8-bada-837e28d0d42c-operator-scripts\") pod \"neutron-db-create-c9w56\" (UID: \"5dc41bf6-4400-47d8-bada-837e28d0d42c\") " pod="openstack/neutron-db-create-c9w56" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.529656 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dc41bf6-4400-47d8-bada-837e28d0d42c-operator-scripts\") pod \"neutron-db-create-c9w56\" (UID: \"5dc41bf6-4400-47d8-bada-837e28d0d42c\") " pod="openstack/neutron-db-create-c9w56" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.529673 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567a40e-3ae3-4d2d-be6c-25d0e1175711-operator-scripts\") pod \"neutron-20bd-account-create-update-pdrs4\" (UID: \"b567a40e-3ae3-4d2d-be6c-25d0e1175711\") " pod="openstack/neutron-20bd-account-create-update-pdrs4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.547628 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng5fn\" (UniqueName: 
\"kubernetes.io/projected/5dc41bf6-4400-47d8-bada-837e28d0d42c-kube-api-access-ng5fn\") pod \"neutron-db-create-c9w56\" (UID: \"5dc41bf6-4400-47d8-bada-837e28d0d42c\") " pod="openstack/neutron-db-create-c9w56" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.548178 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zbtb\" (UniqueName: \"kubernetes.io/projected/b567a40e-3ae3-4d2d-be6c-25d0e1175711-kube-api-access-5zbtb\") pod \"neutron-20bd-account-create-update-pdrs4\" (UID: \"b567a40e-3ae3-4d2d-be6c-25d0e1175711\") " pod="openstack/neutron-20bd-account-create-update-pdrs4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.551947 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.644843 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxjlt\" (UniqueName: \"kubernetes.io/projected/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-kube-api-access-nxjlt\") pod \"cloudkitty-feeb-account-create-update-kf5rr\" (UID: \"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f\") " pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.644982 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-operator-scripts\") pod \"cloudkitty-feeb-account-create-update-kf5rr\" (UID: \"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f\") " pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.645899 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-operator-scripts\") pod \"cloudkitty-feeb-account-create-update-kf5rr\" (UID: 
\"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f\") " pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.669335 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxjlt\" (UniqueName: \"kubernetes.io/projected/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-kube-api-access-nxjlt\") pod \"cloudkitty-feeb-account-create-update-kf5rr\" (UID: \"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f\") " pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.685259 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-c9w56" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.689984 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-20bd-account-create-update-pdrs4" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.697847 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-q9pnh"] Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.759747 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" Feb 16 21:57:59 crc kubenswrapper[4777]: W0216 21:57:59.837950 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61160d7e_1d8e_47fa_94c6_02c654559bca.slice/crio-0f37a8cc056e3206bedced5f4adf3e1313921d04477c20c5642dac4551da5b58 WatchSource:0}: Error finding container 0f37a8cc056e3206bedced5f4adf3e1313921d04477c20c5642dac4551da5b58: Status 404 returned error can't find the container with id 0f37a8cc056e3206bedced5f4adf3e1313921d04477c20c5642dac4551da5b58 Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.845551 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2xkxn" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.952303 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-operator-scripts\") pod \"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5\" (UID: \"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5\") " Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.952602 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnhf4\" (UniqueName: \"kubernetes.io/projected/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-kube-api-access-jnhf4\") pod \"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5\" (UID: \"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5\") " Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.953376 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bbae3fdc-fa7e-41bc-8c73-8d126b476ba5" (UID: "bbae3fdc-fa7e-41bc-8c73-8d126b476ba5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:57:59 crc kubenswrapper[4777]: I0216 21:57:59.959848 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-kube-api-access-jnhf4" (OuterVolumeSpecName: "kube-api-access-jnhf4") pod "bbae3fdc-fa7e-41bc-8c73-8d126b476ba5" (UID: "bbae3fdc-fa7e-41bc-8c73-8d126b476ba5"). InnerVolumeSpecName "kube-api-access-jnhf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.064838 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.064867 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnhf4\" (UniqueName: \"kubernetes.io/projected/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5-kube-api-access-jnhf4\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.296590 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5qdr-config-8crl9" event={"ID":"8879ecb0-53c9-4a75-993c-10f5b0c6cbba","Type":"ContainerDied","Data":"a73b78565c2ce60a27cc557aaa49f383fe36f2fa31ced84d6c026692ef06e7c1"} Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.296864 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a73b78565c2ce60a27cc557aaa49f383fe36f2fa31ced84d6c026692ef06e7c1" Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.297920 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-q9pnh" event={"ID":"61160d7e-1d8e-47fa-94c6-02c654559bca","Type":"ContainerStarted","Data":"0f37a8cc056e3206bedced5f4adf3e1313921d04477c20c5642dac4551da5b58"} Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.308691 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2xkxn" event={"ID":"bbae3fdc-fa7e-41bc-8c73-8d126b476ba5","Type":"ContainerDied","Data":"1b2dcbb25624b0e475a6e7c832d0180bf63e6a9f18bf76bb0184778d1e4adde9"} Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.308759 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b2dcbb25624b0e475a6e7c832d0180bf63e6a9f18bf76bb0184778d1e4adde9" Feb 16 
21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.308842 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2xkxn"
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.510854 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5qdr-config-8crl9"
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.584309 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-scripts\") pod \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") "
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.584388 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run-ovn\") pod \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") "
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.584429 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run\") pod \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") "
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.584482 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6v2n\" (UniqueName: \"kubernetes.io/projected/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-kube-api-access-c6v2n\") pod \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") "
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.584499 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-additional-scripts\") pod \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") "
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.584573 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-log-ovn\") pod \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\" (UID: \"8879ecb0-53c9-4a75-993c-10f5b0c6cbba\") "
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.585020 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8879ecb0-53c9-4a75-993c-10f5b0c6cbba" (UID: "8879ecb0-53c9-4a75-993c-10f5b0c6cbba"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.588066 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run" (OuterVolumeSpecName: "var-run") pod "8879ecb0-53c9-4a75-993c-10f5b0c6cbba" (UID: "8879ecb0-53c9-4a75-993c-10f5b0c6cbba"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.588126 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8879ecb0-53c9-4a75-993c-10f5b0c6cbba" (UID: "8879ecb0-53c9-4a75-993c-10f5b0c6cbba"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.590818 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8879ecb0-53c9-4a75-993c-10f5b0c6cbba" (UID: "8879ecb0-53c9-4a75-993c-10f5b0c6cbba"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.591198 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-scripts" (OuterVolumeSpecName: "scripts") pod "8879ecb0-53c9-4a75-993c-10f5b0c6cbba" (UID: "8879ecb0-53c9-4a75-993c-10f5b0c6cbba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:58:00 crc kubenswrapper[4777]: I0216 21:58:00.605762 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-kube-api-access-c6v2n" (OuterVolumeSpecName: "kube-api-access-c6v2n") pod "8879ecb0-53c9-4a75-993c-10f5b0c6cbba" (UID: "8879ecb0-53c9-4a75-993c-10f5b0c6cbba"). InnerVolumeSpecName "kube-api-access-c6v2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:00.686605 4777 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:00.686633 4777 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-run\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:00.686643 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6v2n\" (UniqueName: \"kubernetes.io/projected/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-kube-api-access-c6v2n\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:00.686654 4777 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:00.686663 4777 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:00.686670 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8879ecb0-53c9-4a75-993c-10f5b0c6cbba-scripts\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:00.824729 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6597-account-create-update-8t7kc"]
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:00.854731 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:00.885280 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-20bd-account-create-update-pdrs4"]
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:00.956812 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.318350 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"a1f0a77dd928b08bbfa275e454adfb40a716a1038ad585be1fa3e4d7f68e1ad4"}
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.318388 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"18ee2ab852ed8db6a26da7e1890dffa1b91f4824350d556afe650879a48d831c"}
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.322651 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6597-account-create-update-8t7kc" event={"ID":"e859b095-dbc8-4bea-8d75-e576976dffaa","Type":"ContainerStarted","Data":"97ddbf1da018ec47aa1e601737e5ff32b412d656f33dc531197a111f70258c46"}
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.322676 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6597-account-create-update-8t7kc" event={"ID":"e859b095-dbc8-4bea-8d75-e576976dffaa","Type":"ContainerStarted","Data":"d5a83c16b065b9d54417b6d47a0bed3bf735277da0769d7dc0186b11b6dff5c8"}
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.328348 4777 generic.go:334] "Generic (PLEG): container finished" podID="61160d7e-1d8e-47fa-94c6-02c654559bca" containerID="62efaf26c417064b51a69213ea0542cc2e59224f3af56ea1395dbeb01d212e7a" exitCode=0
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.328487 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-q9pnh" event={"ID":"61160d7e-1d8e-47fa-94c6-02c654559bca","Type":"ContainerDied","Data":"62efaf26c417064b51a69213ea0542cc2e59224f3af56ea1395dbeb01d212e7a"}
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.357904 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5qdr-config-8crl9"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.357974 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-20bd-account-create-update-pdrs4" event={"ID":"b567a40e-3ae3-4d2d-be6c-25d0e1175711","Type":"ContainerStarted","Data":"8431a8095a25d5d0417790cd948dc752bc6f939f39142f76bfd175d99efadf92"}
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.358000 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-20bd-account-create-update-pdrs4" event={"ID":"b567a40e-3ae3-4d2d-be6c-25d0e1175711","Type":"ContainerStarted","Data":"e70162cbbbece2d32540d4b40bed1cd99cf769b51bb99ed55e03ba6134689e96"}
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.365342 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-6597-account-create-update-8t7kc" podStartSLOduration=3.365325356 podStartE2EDuration="3.365325356s" podCreationTimestamp="2026-02-16 21:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:01.33976816 +0000 UTC m=+1201.922269262" watchObservedRunningTime="2026-02-16 21:58:01.365325356 +0000 UTC m=+1201.947826458"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.379955 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-20bd-account-create-update-pdrs4" podStartSLOduration=2.379938895 podStartE2EDuration="2.379938895s" podCreationTimestamp="2026-02-16 21:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:01.378904546 +0000 UTC m=+1201.961405648" watchObservedRunningTime="2026-02-16 21:58:01.379938895 +0000 UTC m=+1201.962439997"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.628988 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-q5qdr-config-8crl9"]
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.642798 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-q5qdr-config-8crl9"]
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.703528 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-q5qdr-config-q8w94"]
Feb 16 21:58:01 crc kubenswrapper[4777]: E0216 21:58:01.703852 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbae3fdc-fa7e-41bc-8c73-8d126b476ba5" containerName="mariadb-account-create-update"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.703864 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbae3fdc-fa7e-41bc-8c73-8d126b476ba5" containerName="mariadb-account-create-update"
Feb 16 21:58:01 crc kubenswrapper[4777]: E0216 21:58:01.703878 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8879ecb0-53c9-4a75-993c-10f5b0c6cbba" containerName="ovn-config"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.703884 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="8879ecb0-53c9-4a75-993c-10f5b0c6cbba" containerName="ovn-config"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.704061 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbae3fdc-fa7e-41bc-8c73-8d126b476ba5" containerName="mariadb-account-create-update"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.704080 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="8879ecb0-53c9-4a75-993c-10f5b0c6cbba" containerName="ovn-config"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.704618 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.706343 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.725734 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5qdr-config-q8w94"]
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.812823 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run-ovn\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.812853 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-scripts\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.812909 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f92vx\" (UniqueName: \"kubernetes.io/projected/83e85b51-e3b8-4542-a9ba-3350e5abb69d-kube-api-access-f92vx\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.813002 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-log-ovn\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.813054 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-additional-scripts\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.813073 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.915795 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run-ovn\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.915838 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-scripts\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.915866 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f92vx\" (UniqueName: \"kubernetes.io/projected/83e85b51-e3b8-4542-a9ba-3350e5abb69d-kube-api-access-f92vx\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.915918 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-log-ovn\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.915938 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-additional-scripts\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.915962 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.916244 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.916299 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run-ovn\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.916521 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-log-ovn\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.917469 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-additional-scripts\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.918080 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-scripts\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:01 crc kubenswrapper[4777]: I0216 21:58:01.988549 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f92vx\" (UniqueName: \"kubernetes.io/projected/83e85b51-e3b8-4542-a9ba-3350e5abb69d-kube-api-access-f92vx\") pod \"ovn-controller-q5qdr-config-q8w94\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.017688 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5qdr-config-q8w94"
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.192928 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8879ecb0-53c9-4a75-993c-10f5b0c6cbba" path="/var/lib/kubelet/pods/8879ecb0-53c9-4a75-993c-10f5b0c6cbba/volumes"
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.203691 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4faf-account-create-update-xz29p"]
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.247522 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.368949 4777 generic.go:334] "Generic (PLEG): container finished" podID="e859b095-dbc8-4bea-8d75-e576976dffaa" containerID="97ddbf1da018ec47aa1e601737e5ff32b412d656f33dc531197a111f70258c46" exitCode=0
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.369212 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6597-account-create-update-8t7kc" event={"ID":"e859b095-dbc8-4bea-8d75-e576976dffaa","Type":"ContainerDied","Data":"97ddbf1da018ec47aa1e601737e5ff32b412d656f33dc531197a111f70258c46"}
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.376409 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"9c93119187c9a9e8e198689b11d00bc119e94cd9af55bf333f37104e0e7b08cf"}
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.376446 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"49032692ecf8204bdbb34de0eea60c280501d607be84fadf73cbe9c261787e14"}
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.377956 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4faf-account-create-update-xz29p" event={"ID":"cf2488e5-f431-4464-b9f9-f549322948d1","Type":"ContainerStarted","Data":"16d17a2b73af85d8d71e45b9447dbc094532647102b47f5f52dff6ef523f6989"}
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.523638 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-h4sfw"]
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.537564 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-c9w56"]
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.550281 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-dqt5c"]
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.557911 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-feeb-account-create-update-kf5rr"]
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.564413 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret"
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.582189 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wxbd4"]
Feb 16 21:58:02 crc kubenswrapper[4777]: W0216 21:58:02.596758 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68c19c4b_cd3c_43b7_bb94_dba6086f3980.slice/crio-7ea6985b84fe6b2673228895f8297da11888c9df178fb57ce3bcf32f376aea7f WatchSource:0}: Error finding container 7ea6985b84fe6b2673228895f8297da11888c9df178fb57ce3bcf32f376aea7f: Status 404 returned error can't find the container with id 7ea6985b84fe6b2673228895f8297da11888c9df178fb57ce3bcf32f376aea7f
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.619285 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5qdr-config-q8w94"]
Feb 16 21:58:02 crc kubenswrapper[4777]: W0216 21:58:02.626018 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83e85b51_e3b8_4542_a9ba_3350e5abb69d.slice/crio-6140f4002553a27a5e61fbcf1af0d0ab788ffdadd9100a0bbbe302ca5edb338b WatchSource:0}: Error finding container 6140f4002553a27a5e61fbcf1af0d0ab788ffdadd9100a0bbbe302ca5edb338b: Status 404 returned error can't find the container with id 6140f4002553a27a5e61fbcf1af0d0ab788ffdadd9100a0bbbe302ca5edb338b
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.735773 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-q9pnh"
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.840425 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61160d7e-1d8e-47fa-94c6-02c654559bca-operator-scripts\") pod \"61160d7e-1d8e-47fa-94c6-02c654559bca\" (UID: \"61160d7e-1d8e-47fa-94c6-02c654559bca\") "
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.840645 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9z7z\" (UniqueName: \"kubernetes.io/projected/61160d7e-1d8e-47fa-94c6-02c654559bca-kube-api-access-d9z7z\") pod \"61160d7e-1d8e-47fa-94c6-02c654559bca\" (UID: \"61160d7e-1d8e-47fa-94c6-02c654559bca\") "
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.841011 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61160d7e-1d8e-47fa-94c6-02c654559bca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61160d7e-1d8e-47fa-94c6-02c654559bca" (UID: "61160d7e-1d8e-47fa-94c6-02c654559bca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.846833 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61160d7e-1d8e-47fa-94c6-02c654559bca-kube-api-access-d9z7z" (OuterVolumeSpecName: "kube-api-access-d9z7z") pod "61160d7e-1d8e-47fa-94c6-02c654559bca" (UID: "61160d7e-1d8e-47fa-94c6-02c654559bca"). InnerVolumeSpecName "kube-api-access-d9z7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.942177 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9z7z\" (UniqueName: \"kubernetes.io/projected/61160d7e-1d8e-47fa-94c6-02c654559bca-kube-api-access-d9z7z\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:02 crc kubenswrapper[4777]: I0216 21:58:02.942204 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61160d7e-1d8e-47fa-94c6-02c654559bca-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.389690 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h4sfw" event={"ID":"5dde5180-41b7-4ccd-b5bf-d144b205d163","Type":"ContainerStarted","Data":"3e3782defba4411e31f1b5f7d5df5e9fb05b740107240e4f1cae8049c65ff34f"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.391208 4777 generic.go:334] "Generic (PLEG): container finished" podID="68c19c4b-cd3c-43b7-bb94-dba6086f3980" containerID="2cbedc7a5b8b00e413c87ba7d9ebd19f5a7d93ebb97a6344216d412100b2034d" exitCode=0
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.391247 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wxbd4" event={"ID":"68c19c4b-cd3c-43b7-bb94-dba6086f3980","Type":"ContainerDied","Data":"2cbedc7a5b8b00e413c87ba7d9ebd19f5a7d93ebb97a6344216d412100b2034d"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.391261 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wxbd4" event={"ID":"68c19c4b-cd3c-43b7-bb94-dba6086f3980","Type":"ContainerStarted","Data":"7ea6985b84fe6b2673228895f8297da11888c9df178fb57ce3bcf32f376aea7f"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.392860 4777 generic.go:334] "Generic (PLEG): container finished" podID="83e85b51-e3b8-4542-a9ba-3350e5abb69d" containerID="510f6b0d603b1f6f11444e526b6aaf6eb9e02c8b5586a53aaad0c39ba9014898" exitCode=0
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.392953 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5qdr-config-q8w94" event={"ID":"83e85b51-e3b8-4542-a9ba-3350e5abb69d","Type":"ContainerDied","Data":"510f6b0d603b1f6f11444e526b6aaf6eb9e02c8b5586a53aaad0c39ba9014898"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.392974 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5qdr-config-q8w94" event={"ID":"83e85b51-e3b8-4542-a9ba-3350e5abb69d","Type":"ContainerStarted","Data":"6140f4002553a27a5e61fbcf1af0d0ab788ffdadd9100a0bbbe302ca5edb338b"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.395068 4777 generic.go:334] "Generic (PLEG): container finished" podID="6829d7a1-c6c5-4e57-be05-9b6335f1ad0f" containerID="23005eefba37937f6630df4696c151df8a7e93d03a7172ab3fa1f13f5dc8d1f7" exitCode=0
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.395106 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" event={"ID":"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f","Type":"ContainerDied","Data":"23005eefba37937f6630df4696c151df8a7e93d03a7172ab3fa1f13f5dc8d1f7"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.395120 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" event={"ID":"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f","Type":"ContainerStarted","Data":"dc5acfee6bb139409c2a16cae1d952421e386a997817b142dd99c3ccfe70de41"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.396684 4777 generic.go:334] "Generic (PLEG): container finished" podID="e5063ed0-49f5-4947-af98-867246696986" containerID="f5c6f3186946a134960253907604d5742844e72df76628471d7423224b92ab9d" exitCode=0
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.396738 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-dqt5c" event={"ID":"e5063ed0-49f5-4947-af98-867246696986","Type":"ContainerDied","Data":"f5c6f3186946a134960253907604d5742844e72df76628471d7423224b92ab9d"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.396752 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-dqt5c" event={"ID":"e5063ed0-49f5-4947-af98-867246696986","Type":"ContainerStarted","Data":"23d640cb697a59b26071a28154411ce6b78bc902b540d8615fa36c0d61a0bff9"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.398288 4777 generic.go:334] "Generic (PLEG): container finished" podID="cf2488e5-f431-4464-b9f9-f549322948d1" containerID="e17565b4ac5f1ccb765fc317a61ae3cb0a819c77fa7ff30afb09275ac294991e" exitCode=0
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.398368 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4faf-account-create-update-xz29p" event={"ID":"cf2488e5-f431-4464-b9f9-f549322948d1","Type":"ContainerDied","Data":"e17565b4ac5f1ccb765fc317a61ae3cb0a819c77fa7ff30afb09275ac294991e"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.399659 4777 generic.go:334] "Generic (PLEG): container finished" podID="b567a40e-3ae3-4d2d-be6c-25d0e1175711" containerID="8431a8095a25d5d0417790cd948dc752bc6f939f39142f76bfd175d99efadf92" exitCode=0
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.399696 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-20bd-account-create-update-pdrs4" event={"ID":"b567a40e-3ae3-4d2d-be6c-25d0e1175711","Type":"ContainerDied","Data":"8431a8095a25d5d0417790cd948dc752bc6f939f39142f76bfd175d99efadf92"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.400866 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-q9pnh" event={"ID":"61160d7e-1d8e-47fa-94c6-02c654559bca","Type":"ContainerDied","Data":"0f37a8cc056e3206bedced5f4adf3e1313921d04477c20c5642dac4551da5b58"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.400886 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f37a8cc056e3206bedced5f4adf3e1313921d04477c20c5642dac4551da5b58"
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.400918 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-q9pnh"
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.403160 4777 generic.go:334] "Generic (PLEG): container finished" podID="5dc41bf6-4400-47d8-bada-837e28d0d42c" containerID="3276a0c72c1bf3608f60770961e05b9c8139e26f26c6943340511ee32613ca2d" exitCode=0
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.403375 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c9w56" event={"ID":"5dc41bf6-4400-47d8-bada-837e28d0d42c","Type":"ContainerDied","Data":"3276a0c72c1bf3608f60770961e05b9c8139e26f26c6943340511ee32613ca2d"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.403405 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c9w56" event={"ID":"5dc41bf6-4400-47d8-bada-837e28d0d42c","Type":"ContainerStarted","Data":"0fb564b7a4015d33d4f931944535abe81fd055cb1236375cb4606b8bed8ea599"}
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.739753 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6597-account-create-update-8t7kc"
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.873831 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e859b095-dbc8-4bea-8d75-e576976dffaa-operator-scripts\") pod \"e859b095-dbc8-4bea-8d75-e576976dffaa\" (UID: \"e859b095-dbc8-4bea-8d75-e576976dffaa\") "
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.873952 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p2bv\" (UniqueName: \"kubernetes.io/projected/e859b095-dbc8-4bea-8d75-e576976dffaa-kube-api-access-5p2bv\") pod \"e859b095-dbc8-4bea-8d75-e576976dffaa\" (UID: \"e859b095-dbc8-4bea-8d75-e576976dffaa\") "
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.874563 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e859b095-dbc8-4bea-8d75-e576976dffaa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e859b095-dbc8-4bea-8d75-e576976dffaa" (UID: "e859b095-dbc8-4bea-8d75-e576976dffaa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.877143 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e859b095-dbc8-4bea-8d75-e576976dffaa-kube-api-access-5p2bv" (OuterVolumeSpecName: "kube-api-access-5p2bv") pod "e859b095-dbc8-4bea-8d75-e576976dffaa" (UID: "e859b095-dbc8-4bea-8d75-e576976dffaa"). InnerVolumeSpecName "kube-api-access-5p2bv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.975917 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e859b095-dbc8-4bea-8d75-e576976dffaa-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:03 crc kubenswrapper[4777]: I0216 21:58:03.975953 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p2bv\" (UniqueName: \"kubernetes.io/projected/e859b095-dbc8-4bea-8d75-e576976dffaa-kube-api-access-5p2bv\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:04 crc kubenswrapper[4777]: I0216 21:58:04.422683 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6597-account-create-update-8t7kc" event={"ID":"e859b095-dbc8-4bea-8d75-e576976dffaa","Type":"ContainerDied","Data":"d5a83c16b065b9d54417b6d47a0bed3bf735277da0769d7dc0186b11b6dff5c8"}
Feb 16 21:58:04 crc kubenswrapper[4777]: I0216 21:58:04.422798 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5a83c16b065b9d54417b6d47a0bed3bf735277da0769d7dc0186b11b6dff5c8"
Feb 16 21:58:04 crc kubenswrapper[4777]: I0216 21:58:04.422769 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6597-account-create-update-8t7kc"
Feb 16 21:58:04 crc kubenswrapper[4777]: I0216 21:58:04.432120 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"f9d9b7bad799b434fd8369850ffad33c10cc40f508a1d8b874eefe0df1526960"}
Feb 16 21:58:04 crc kubenswrapper[4777]: I0216 21:58:04.432163 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"464db9a541c72bfd7723c7a44541b2a8239e151c2965c7d21d55f86aedfc965c"}
Feb 16 21:58:04 crc kubenswrapper[4777]: I0216 21:58:04.432173 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"7f031f3a004a3c3d1866ea3d60b48d7ee6601764e30913fa4129cb928dec4186"}
Feb 16 21:58:04 crc kubenswrapper[4777]: I0216 21:58:04.432184 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"88f09b52b2a67e41d1b52df01816194345cca292e0bcc4653f3b3097c906bedd"}
Feb 16 21:58:04 crc kubenswrapper[4777]: I0216 21:58:04.435889 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03c6545a-838a-444c-8833-871730be59a7","Type":"ContainerStarted","Data":"dc898c01403274b4b805c549d8cd8b097c604ea545e4b31539f34ddb77afe9ca"}
Feb 16 21:58:04 crc kubenswrapper[4777]: I0216 21:58:04.435925 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"03c6545a-838a-444c-8833-871730be59a7","Type":"ContainerStarted","Data":"7acb8127d529b782cf865aa5a3783c4a14f74ab363378c77cd4d7a97d41a715f"}
Feb 16 21:58:04 crc kubenswrapper[4777]: I0216 21:58:04.472146 4777
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.472128618 podStartE2EDuration="19.472128618s" podCreationTimestamp="2026-02-16 21:57:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:04.468001993 +0000 UTC m=+1205.050503095" watchObservedRunningTime="2026-02-16 21:58:04.472128618 +0000 UTC m=+1205.054629720" Feb 16 21:58:04 crc kubenswrapper[4777]: I0216 21:58:04.900813 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4faf-account-create-update-xz29p" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.009731 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf2488e5-f431-4464-b9f9-f549322948d1-operator-scripts\") pod \"cf2488e5-f431-4464-b9f9-f549322948d1\" (UID: \"cf2488e5-f431-4464-b9f9-f549322948d1\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.009801 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpvnd\" (UniqueName: \"kubernetes.io/projected/cf2488e5-f431-4464-b9f9-f549322948d1-kube-api-access-hpvnd\") pod \"cf2488e5-f431-4464-b9f9-f549322948d1\" (UID: \"cf2488e5-f431-4464-b9f9-f549322948d1\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.011213 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf2488e5-f431-4464-b9f9-f549322948d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf2488e5-f431-4464-b9f9-f549322948d1" (UID: "cf2488e5-f431-4464-b9f9-f549322948d1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.033825 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf2488e5-f431-4464-b9f9-f549322948d1-kube-api-access-hpvnd" (OuterVolumeSpecName: "kube-api-access-hpvnd") pod "cf2488e5-f431-4464-b9f9-f549322948d1" (UID: "cf2488e5-f431-4464-b9f9-f549322948d1"). InnerVolumeSpecName "kube-api-access-hpvnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.112046 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf2488e5-f431-4464-b9f9-f549322948d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.112074 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpvnd\" (UniqueName: \"kubernetes.io/projected/cf2488e5-f431-4464-b9f9-f549322948d1-kube-api-access-hpvnd\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.186581 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wxbd4" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.191917 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-dqt5c" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.214029 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.216759 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-20bd-account-create-update-pdrs4" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.222809 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-c9w56" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.232467 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5qdr-config-q8w94" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.314936 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run-ovn\") pod \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315065 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-log-ovn\") pod \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315096 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-scripts\") pod \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315118 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5063ed0-49f5-4947-af98-867246696986-operator-scripts\") pod \"e5063ed0-49f5-4947-af98-867246696986\" (UID: \"e5063ed0-49f5-4947-af98-867246696986\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315153 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f92vx\" (UniqueName: \"kubernetes.io/projected/83e85b51-e3b8-4542-a9ba-3350e5abb69d-kube-api-access-f92vx\") pod \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\" (UID: 
\"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315170 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-operator-scripts\") pod \"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f\" (UID: \"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315193 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-additional-scripts\") pod \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315179 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "83e85b51-e3b8-4542-a9ba-3350e5abb69d" (UID: "83e85b51-e3b8-4542-a9ba-3350e5abb69d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315215 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zbtb\" (UniqueName: \"kubernetes.io/projected/b567a40e-3ae3-4d2d-be6c-25d0e1175711-kube-api-access-5zbtb\") pod \"b567a40e-3ae3-4d2d-be6c-25d0e1175711\" (UID: \"b567a40e-3ae3-4d2d-be6c-25d0e1175711\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315190 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "83e85b51-e3b8-4542-a9ba-3350e5abb69d" (UID: "83e85b51-e3b8-4542-a9ba-3350e5abb69d"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315233 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxjlt\" (UniqueName: \"kubernetes.io/projected/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-kube-api-access-nxjlt\") pod \"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f\" (UID: \"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315262 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h92rc\" (UniqueName: \"kubernetes.io/projected/e5063ed0-49f5-4947-af98-867246696986-kube-api-access-h92rc\") pod \"e5063ed0-49f5-4947-af98-867246696986\" (UID: \"e5063ed0-49f5-4947-af98-867246696986\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315301 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567a40e-3ae3-4d2d-be6c-25d0e1175711-operator-scripts\") pod \"b567a40e-3ae3-4d2d-be6c-25d0e1175711\" (UID: \"b567a40e-3ae3-4d2d-be6c-25d0e1175711\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315325 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run\") pod \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\" (UID: \"83e85b51-e3b8-4542-a9ba-3350e5abb69d\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315356 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng5fn\" (UniqueName: \"kubernetes.io/projected/5dc41bf6-4400-47d8-bada-837e28d0d42c-kube-api-access-ng5fn\") pod \"5dc41bf6-4400-47d8-bada-837e28d0d42c\" (UID: \"5dc41bf6-4400-47d8-bada-837e28d0d42c\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315382 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dc41bf6-4400-47d8-bada-837e28d0d42c-operator-scripts\") pod \"5dc41bf6-4400-47d8-bada-837e28d0d42c\" (UID: \"5dc41bf6-4400-47d8-bada-837e28d0d42c\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315405 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk8jh\" (UniqueName: \"kubernetes.io/projected/68c19c4b-cd3c-43b7-bb94-dba6086f3980-kube-api-access-jk8jh\") pod \"68c19c4b-cd3c-43b7-bb94-dba6086f3980\" (UID: \"68c19c4b-cd3c-43b7-bb94-dba6086f3980\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315496 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68c19c4b-cd3c-43b7-bb94-dba6086f3980-operator-scripts\") pod \"68c19c4b-cd3c-43b7-bb94-dba6086f3980\" (UID: \"68c19c4b-cd3c-43b7-bb94-dba6086f3980\") " Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315937 4777 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315962 4777 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.316216 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "83e85b51-e3b8-4542-a9ba-3350e5abb69d" (UID: "83e85b51-e3b8-4542-a9ba-3350e5abb69d"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.316338 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6829d7a1-c6c5-4e57-be05-9b6335f1ad0f" (UID: "6829d7a1-c6c5-4e57-be05-9b6335f1ad0f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.316795 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dc41bf6-4400-47d8-bada-837e28d0d42c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5dc41bf6-4400-47d8-bada-837e28d0d42c" (UID: "5dc41bf6-4400-47d8-bada-837e28d0d42c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.317018 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b567a40e-3ae3-4d2d-be6c-25d0e1175711-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b567a40e-3ae3-4d2d-be6c-25d0e1175711" (UID: "b567a40e-3ae3-4d2d-be6c-25d0e1175711"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.317225 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run" (OuterVolumeSpecName: "var-run") pod "83e85b51-e3b8-4542-a9ba-3350e5abb69d" (UID: "83e85b51-e3b8-4542-a9ba-3350e5abb69d"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.315699 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5063ed0-49f5-4947-af98-867246696986-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5063ed0-49f5-4947-af98-867246696986" (UID: "e5063ed0-49f5-4947-af98-867246696986"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.317352 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68c19c4b-cd3c-43b7-bb94-dba6086f3980-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68c19c4b-cd3c-43b7-bb94-dba6086f3980" (UID: "68c19c4b-cd3c-43b7-bb94-dba6086f3980"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.318175 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-scripts" (OuterVolumeSpecName: "scripts") pod "83e85b51-e3b8-4542-a9ba-3350e5abb69d" (UID: "83e85b51-e3b8-4542-a9ba-3350e5abb69d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.320215 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c19c4b-cd3c-43b7-bb94-dba6086f3980-kube-api-access-jk8jh" (OuterVolumeSpecName: "kube-api-access-jk8jh") pod "68c19c4b-cd3c-43b7-bb94-dba6086f3980" (UID: "68c19c4b-cd3c-43b7-bb94-dba6086f3980"). InnerVolumeSpecName "kube-api-access-jk8jh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.320849 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5063ed0-49f5-4947-af98-867246696986-kube-api-access-h92rc" (OuterVolumeSpecName: "kube-api-access-h92rc") pod "e5063ed0-49f5-4947-af98-867246696986" (UID: "e5063ed0-49f5-4947-af98-867246696986"). InnerVolumeSpecName "kube-api-access-h92rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.321451 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc41bf6-4400-47d8-bada-837e28d0d42c-kube-api-access-ng5fn" (OuterVolumeSpecName: "kube-api-access-ng5fn") pod "5dc41bf6-4400-47d8-bada-837e28d0d42c" (UID: "5dc41bf6-4400-47d8-bada-837e28d0d42c"). InnerVolumeSpecName "kube-api-access-ng5fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.326932 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-kube-api-access-nxjlt" (OuterVolumeSpecName: "kube-api-access-nxjlt") pod "6829d7a1-c6c5-4e57-be05-9b6335f1ad0f" (UID: "6829d7a1-c6c5-4e57-be05-9b6335f1ad0f"). InnerVolumeSpecName "kube-api-access-nxjlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.328772 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e85b51-e3b8-4542-a9ba-3350e5abb69d-kube-api-access-f92vx" (OuterVolumeSpecName: "kube-api-access-f92vx") pod "83e85b51-e3b8-4542-a9ba-3350e5abb69d" (UID: "83e85b51-e3b8-4542-a9ba-3350e5abb69d"). InnerVolumeSpecName "kube-api-access-f92vx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.338623 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b567a40e-3ae3-4d2d-be6c-25d0e1175711-kube-api-access-5zbtb" (OuterVolumeSpecName: "kube-api-access-5zbtb") pod "b567a40e-3ae3-4d2d-be6c-25d0e1175711" (UID: "b567a40e-3ae3-4d2d-be6c-25d0e1175711"). InnerVolumeSpecName "kube-api-access-5zbtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417573 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567a40e-3ae3-4d2d-be6c-25d0e1175711-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417624 4777 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/83e85b51-e3b8-4542-a9ba-3350e5abb69d-var-run\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417639 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng5fn\" (UniqueName: \"kubernetes.io/projected/5dc41bf6-4400-47d8-bada-837e28d0d42c-kube-api-access-ng5fn\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417655 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dc41bf6-4400-47d8-bada-837e28d0d42c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417668 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk8jh\" (UniqueName: \"kubernetes.io/projected/68c19c4b-cd3c-43b7-bb94-dba6086f3980-kube-api-access-jk8jh\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417679 4777 reconciler_common.go:293] "Volume detached for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68c19c4b-cd3c-43b7-bb94-dba6086f3980-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417692 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417705 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5063ed0-49f5-4947-af98-867246696986-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417736 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f92vx\" (UniqueName: \"kubernetes.io/projected/83e85b51-e3b8-4542-a9ba-3350e5abb69d-kube-api-access-f92vx\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417748 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417761 4777 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/83e85b51-e3b8-4542-a9ba-3350e5abb69d-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417773 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zbtb\" (UniqueName: \"kubernetes.io/projected/b567a40e-3ae3-4d2d-be6c-25d0e1175711-kube-api-access-5zbtb\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417786 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxjlt\" (UniqueName: 
\"kubernetes.io/projected/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f-kube-api-access-nxjlt\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.417798 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h92rc\" (UniqueName: \"kubernetes.io/projected/e5063ed0-49f5-4947-af98-867246696986-kube-api-access-h92rc\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.446162 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-20bd-account-create-update-pdrs4" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.446161 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-20bd-account-create-update-pdrs4" event={"ID":"b567a40e-3ae3-4d2d-be6c-25d0e1175711","Type":"ContainerDied","Data":"e70162cbbbece2d32540d4b40bed1cd99cf769b51bb99ed55e03ba6134689e96"} Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.446306 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e70162cbbbece2d32540d4b40bed1cd99cf769b51bb99ed55e03ba6134689e96" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.447674 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wxbd4" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.447674 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wxbd4" event={"ID":"68c19c4b-cd3c-43b7-bb94-dba6086f3980","Type":"ContainerDied","Data":"7ea6985b84fe6b2673228895f8297da11888c9df178fb57ce3bcf32f376aea7f"} Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.447790 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ea6985b84fe6b2673228895f8297da11888c9df178fb57ce3bcf32f376aea7f" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.452199 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5qdr-config-q8w94" event={"ID":"83e85b51-e3b8-4542-a9ba-3350e5abb69d","Type":"ContainerDied","Data":"6140f4002553a27a5e61fbcf1af0d0ab788ffdadd9100a0bbbe302ca5edb338b"} Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.452230 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6140f4002553a27a5e61fbcf1af0d0ab788ffdadd9100a0bbbe302ca5edb338b" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.452206 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5qdr-config-q8w94" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.453885 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" event={"ID":"6829d7a1-c6c5-4e57-be05-9b6335f1ad0f","Type":"ContainerDied","Data":"dc5acfee6bb139409c2a16cae1d952421e386a997817b142dd99c3ccfe70de41"} Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.453910 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc5acfee6bb139409c2a16cae1d952421e386a997817b142dd99c3ccfe70de41" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.453973 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-feeb-account-create-update-kf5rr" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.456246 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-dqt5c" event={"ID":"e5063ed0-49f5-4947-af98-867246696986","Type":"ContainerDied","Data":"23d640cb697a59b26071a28154411ce6b78bc902b540d8615fa36c0d61a0bff9"} Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.456351 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23d640cb697a59b26071a28154411ce6b78bc902b540d8615fa36c0d61a0bff9" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.456674 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-dqt5c" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.460139 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c9w56" event={"ID":"5dc41bf6-4400-47d8-bada-837e28d0d42c","Type":"ContainerDied","Data":"0fb564b7a4015d33d4f931944535abe81fd055cb1236375cb4606b8bed8ea599"} Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.460176 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fb564b7a4015d33d4f931944535abe81fd055cb1236375cb4606b8bed8ea599" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.460175 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-c9w56" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.462009 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4faf-account-create-update-xz29p" event={"ID":"cf2488e5-f431-4464-b9f9-f549322948d1","Type":"ContainerDied","Data":"16d17a2b73af85d8d71e45b9447dbc094532647102b47f5f52dff6ef523f6989"} Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.462134 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16d17a2b73af85d8d71e45b9447dbc094532647102b47f5f52dff6ef523f6989" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.462041 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4faf-account-create-update-xz29p" Feb 16 21:58:05 crc kubenswrapper[4777]: I0216 21:58:05.467812 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 16 21:58:06 crc kubenswrapper[4777]: I0216 21:58:06.347485 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-q5qdr-config-q8w94"] Feb 16 21:58:06 crc kubenswrapper[4777]: I0216 21:58:06.360429 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-q5qdr-config-q8w94"] Feb 16 21:58:06 crc kubenswrapper[4777]: I0216 21:58:06.393539 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a882b0c3-7f2e-446b-aea4-476cacffb112" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 16 21:58:07 crc kubenswrapper[4777]: I0216 21:58:07.484313 4777 generic.go:334] "Generic (PLEG): container finished" podID="865f0225-f1e8-4fb3-bc27-df7bfdb04f8c" containerID="8c3c4e6df7d757d6543c535e83a20260f43671d7f30177c3e175f3c230eca68a" exitCode=0 Feb 16 21:58:07 crc kubenswrapper[4777]: I0216 21:58:07.484371 4777 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-db-sync-fwhrq" event={"ID":"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c","Type":"ContainerDied","Data":"8c3c4e6df7d757d6543c535e83a20260f43671d7f30177c3e175f3c230eca68a"} Feb 16 21:58:08 crc kubenswrapper[4777]: I0216 21:58:08.198819 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e85b51-e3b8-4542-a9ba-3350e5abb69d" path="/var/lib/kubelet/pods/83e85b51-e3b8-4542-a9ba-3350e5abb69d/volumes" Feb 16 21:58:08 crc kubenswrapper[4777]: I0216 21:58:08.684892 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.101124 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fwhrq" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.238420 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-db-sync-config-data\") pod \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.238490 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-combined-ca-bundle\") pod \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.238671 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-config-data\") pod \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.238689 4777 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6xn8l\" (UniqueName: \"kubernetes.io/projected/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-kube-api-access-6xn8l\") pod \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\" (UID: \"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c\") " Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.246872 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-kube-api-access-6xn8l" (OuterVolumeSpecName: "kube-api-access-6xn8l") pod "865f0225-f1e8-4fb3-bc27-df7bfdb04f8c" (UID: "865f0225-f1e8-4fb3-bc27-df7bfdb04f8c"). InnerVolumeSpecName "kube-api-access-6xn8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.247898 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "865f0225-f1e8-4fb3-bc27-df7bfdb04f8c" (UID: "865f0225-f1e8-4fb3-bc27-df7bfdb04f8c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.270605 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "865f0225-f1e8-4fb3-bc27-df7bfdb04f8c" (UID: "865f0225-f1e8-4fb3-bc27-df7bfdb04f8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.288960 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-config-data" (OuterVolumeSpecName: "config-data") pod "865f0225-f1e8-4fb3-bc27-df7bfdb04f8c" (UID: "865f0225-f1e8-4fb3-bc27-df7bfdb04f8c"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.341547 4777 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.341584 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.341593 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.341604 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xn8l\" (UniqueName: \"kubernetes.io/projected/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c-kube-api-access-6xn8l\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.503526 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h4sfw" event={"ID":"5dde5180-41b7-4ccd-b5bf-d144b205d163","Type":"ContainerStarted","Data":"82212abdbb1c19980082ee4ecdb242e576a429f845cee036d68c272abfb7d41a"} Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.505488 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fwhrq" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.505497 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fwhrq" event={"ID":"865f0225-f1e8-4fb3-bc27-df7bfdb04f8c","Type":"ContainerDied","Data":"b83eb4f8b3c7c00edc9ea4ee2717ca366e91c1b11d3d5025a5393c273b349887"} Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.505553 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b83eb4f8b3c7c00edc9ea4ee2717ca366e91c1b11d3d5025a5393c273b349887" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.520925 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"6f2427811607691cc5224485d6eca7261cd53708d1788d2e238dcb7a9104d67a"} Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.520964 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"b0ff62c168ab1559ef9c335189e22027f2a122ed3fe26e81d614b862bbc2ecba"} Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.520975 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"24d533a3acb8746acccc70f3c59b9ea7a04e6f2b771bb9291144dd7d72e35df4"} Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.520982 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"03b56c8330e167a68abf7800dd3dbf166b20fc51fff42db8603649ed57d585da"} Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.550275 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-h4sfw" podStartSLOduration=4.760484958 
podStartE2EDuration="10.550258102s" podCreationTimestamp="2026-02-16 21:57:59 +0000 UTC" firstStartedPulling="2026-02-16 21:58:02.560591462 +0000 UTC m=+1203.143092564" lastFinishedPulling="2026-02-16 21:58:08.350364606 +0000 UTC m=+1208.932865708" observedRunningTime="2026-02-16 21:58:09.531271221 +0000 UTC m=+1210.113772323" watchObservedRunningTime="2026-02-16 21:58:09.550258102 +0000 UTC m=+1210.132759204" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.901804 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-x5pvr"] Feb 16 21:58:09 crc kubenswrapper[4777]: E0216 21:58:09.902150 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6829d7a1-c6c5-4e57-be05-9b6335f1ad0f" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902161 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="6829d7a1-c6c5-4e57-be05-9b6335f1ad0f" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: E0216 21:58:09.902173 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5063ed0-49f5-4947-af98-867246696986" containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902179 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5063ed0-49f5-4947-af98-867246696986" containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: E0216 21:58:09.902189 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2488e5-f431-4464-b9f9-f549322948d1" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902196 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2488e5-f431-4464-b9f9-f549322948d1" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: E0216 21:58:09.902209 4777 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e859b095-dbc8-4bea-8d75-e576976dffaa" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902214 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e859b095-dbc8-4bea-8d75-e576976dffaa" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: E0216 21:58:09.902228 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e85b51-e3b8-4542-a9ba-3350e5abb69d" containerName="ovn-config" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902233 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e85b51-e3b8-4542-a9ba-3350e5abb69d" containerName="ovn-config" Feb 16 21:58:09 crc kubenswrapper[4777]: E0216 21:58:09.902245 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61160d7e-1d8e-47fa-94c6-02c654559bca" containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902250 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="61160d7e-1d8e-47fa-94c6-02c654559bca" containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: E0216 21:58:09.902260 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c19c4b-cd3c-43b7-bb94-dba6086f3980" containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902266 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c19c4b-cd3c-43b7-bb94-dba6086f3980" containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: E0216 21:58:09.902276 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b567a40e-3ae3-4d2d-be6c-25d0e1175711" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902281 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b567a40e-3ae3-4d2d-be6c-25d0e1175711" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: E0216 21:58:09.902291 
4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="865f0225-f1e8-4fb3-bc27-df7bfdb04f8c" containerName="glance-db-sync" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902296 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="865f0225-f1e8-4fb3-bc27-df7bfdb04f8c" containerName="glance-db-sync" Feb 16 21:58:09 crc kubenswrapper[4777]: E0216 21:58:09.902309 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc41bf6-4400-47d8-bada-837e28d0d42c" containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902315 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc41bf6-4400-47d8-bada-837e28d0d42c" containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902474 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="61160d7e-1d8e-47fa-94c6-02c654559bca" containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902489 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5063ed0-49f5-4947-af98-867246696986" containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902501 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc41bf6-4400-47d8-bada-837e28d0d42c" containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902513 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="6829d7a1-c6c5-4e57-be05-9b6335f1ad0f" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902524 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="865f0225-f1e8-4fb3-bc27-df7bfdb04f8c" containerName="glance-db-sync" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902530 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c19c4b-cd3c-43b7-bb94-dba6086f3980" 
containerName="mariadb-database-create" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902542 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf2488e5-f431-4464-b9f9-f549322948d1" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902550 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e859b095-dbc8-4bea-8d75-e576976dffaa" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902555 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="b567a40e-3ae3-4d2d-be6c-25d0e1175711" containerName="mariadb-account-create-update" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.902569 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e85b51-e3b8-4542-a9ba-3350e5abb69d" containerName="ovn-config" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.903463 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.920286 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-x5pvr"] Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.959620 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-config\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.959936 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.959984 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.960002 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsxwp\" (UniqueName: \"kubernetes.io/projected/6f6d77d8-729a-43a9-a423-be6be2055f86-kube-api-access-tsxwp\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:09 crc kubenswrapper[4777]: I0216 21:58:09.960026 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.061700 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-config\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.061770 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.061801 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.061829 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsxwp\" (UniqueName: \"kubernetes.io/projected/6f6d77d8-729a-43a9-a423-be6be2055f86-kube-api-access-tsxwp\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.061853 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.062754 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.062710 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.063419 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-config\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.063459 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.079574 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsxwp\" (UniqueName: \"kubernetes.io/projected/6f6d77d8-729a-43a9-a423-be6be2055f86-kube-api-access-tsxwp\") pod 
\"dnsmasq-dns-5b946c75cc-x5pvr\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.267680 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.554962 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"927d6e1dafccb543ae3513c308fd377e95ef683a524be0e433d8dc2d1938f428"} Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.555203 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"075a6cb1b3a6fb96afa89078507a459b7361b641b341e08ffd4dff6d5f9aa480"} Feb 16 21:58:10 crc kubenswrapper[4777]: I0216 21:58:10.817863 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-x5pvr"] Feb 16 21:58:11 crc kubenswrapper[4777]: I0216 21:58:11.567679 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3392d073-5de5-4f7e-ae87-e892f769157a","Type":"ContainerStarted","Data":"2194560d57c6cbde8dfd4f5e23c94ce81968e80d88c2d8dcaa49b33f3f5c67bb"} Feb 16 21:58:11 crc kubenswrapper[4777]: I0216 21:58:11.569496 4777 generic.go:334] "Generic (PLEG): container finished" podID="6f6d77d8-729a-43a9-a423-be6be2055f86" containerID="b4a6079c50faf50e2354d6e019120f2ebb7c7862b0660bd6ded05cc480908245" exitCode=0 Feb 16 21:58:11 crc kubenswrapper[4777]: I0216 21:58:11.569535 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" event={"ID":"6f6d77d8-729a-43a9-a423-be6be2055f86","Type":"ContainerDied","Data":"b4a6079c50faf50e2354d6e019120f2ebb7c7862b0660bd6ded05cc480908245"} Feb 16 21:58:11 crc kubenswrapper[4777]: I0216 
21:58:11.569558 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" event={"ID":"6f6d77d8-729a-43a9-a423-be6be2055f86","Type":"ContainerStarted","Data":"c2c92841c7dd03a00da3f2834960bd60c4fccd96aed61eca4b4583a2845aa505"} Feb 16 21:58:11 crc kubenswrapper[4777]: I0216 21:58:11.642889 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=38.08732733 podStartE2EDuration="48.64286895s" podCreationTimestamp="2026-02-16 21:57:23 +0000 UTC" firstStartedPulling="2026-02-16 21:57:57.781821472 +0000 UTC m=+1198.364322584" lastFinishedPulling="2026-02-16 21:58:08.337363082 +0000 UTC m=+1208.919864204" observedRunningTime="2026-02-16 21:58:11.638974691 +0000 UTC m=+1212.221475803" watchObservedRunningTime="2026-02-16 21:58:11.64286895 +0000 UTC m=+1212.225370052" Feb 16 21:58:11 crc kubenswrapper[4777]: I0216 21:58:11.920771 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-x5pvr"] Feb 16 21:58:11 crc kubenswrapper[4777]: I0216 21:58:11.954759 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6vmpp"] Feb 16 21:58:11 crc kubenswrapper[4777]: I0216 21:58:11.960349 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:11 crc kubenswrapper[4777]: I0216 21:58:11.966824 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 16 21:58:11 crc kubenswrapper[4777]: I0216 21:58:11.967083 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6vmpp"] Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.001643 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ql6g\" (UniqueName: \"kubernetes.io/projected/cc22189e-82c6-4078-ac5b-ce27b2e891fd-kube-api-access-5ql6g\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.001682 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.001717 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.001790 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-config\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.001839 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.001854 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.102979 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ql6g\" (UniqueName: \"kubernetes.io/projected/cc22189e-82c6-4078-ac5b-ce27b2e891fd-kube-api-access-5ql6g\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.103019 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.103049 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.103111 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-config\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.103156 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.103173 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.104164 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.104303 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 
21:58:12.104459 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.104570 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-config\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.104742 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.123705 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ql6g\" (UniqueName: \"kubernetes.io/projected/cc22189e-82c6-4078-ac5b-ce27b2e891fd-kube-api-access-5ql6g\") pod \"dnsmasq-dns-74f6bcbc87-6vmpp\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.276361 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.580919 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" event={"ID":"6f6d77d8-729a-43a9-a423-be6be2055f86","Type":"ContainerStarted","Data":"93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39"} Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.581402 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.582320 4777 generic.go:334] "Generic (PLEG): container finished" podID="5dde5180-41b7-4ccd-b5bf-d144b205d163" containerID="82212abdbb1c19980082ee4ecdb242e576a429f845cee036d68c272abfb7d41a" exitCode=0 Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.582382 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h4sfw" event={"ID":"5dde5180-41b7-4ccd-b5bf-d144b205d163","Type":"ContainerDied","Data":"82212abdbb1c19980082ee4ecdb242e576a429f845cee036d68c272abfb7d41a"} Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.615994 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" podStartSLOduration=3.615981014 podStartE2EDuration="3.615981014s" podCreationTimestamp="2026-02-16 21:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:12.607463786 +0000 UTC m=+1213.189964888" watchObservedRunningTime="2026-02-16 21:58:12.615981014 +0000 UTC m=+1213.198482116" Feb 16 21:58:12 crc kubenswrapper[4777]: W0216 21:58:12.735377 4777 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc22189e_82c6_4078_ac5b_ce27b2e891fd.slice/crio-29cdb5b37e4798fc6301318264e2bc4c9448507daf167023869ca84d0bfea888 WatchSource:0}: Error finding container 29cdb5b37e4798fc6301318264e2bc4c9448507daf167023869ca84d0bfea888: Status 404 returned error can't find the container with id 29cdb5b37e4798fc6301318264e2bc4c9448507daf167023869ca84d0bfea888 Feb 16 21:58:12 crc kubenswrapper[4777]: I0216 21:58:12.735953 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6vmpp"] Feb 16 21:58:13 crc kubenswrapper[4777]: I0216 21:58:13.592233 4777 generic.go:334] "Generic (PLEG): container finished" podID="cc22189e-82c6-4078-ac5b-ce27b2e891fd" containerID="13f202891eddbe42db5953a9a4abdfb38766d69f3225c1265e219f95c90a09e4" exitCode=0 Feb 16 21:58:13 crc kubenswrapper[4777]: I0216 21:58:13.592861 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" podUID="6f6d77d8-729a-43a9-a423-be6be2055f86" containerName="dnsmasq-dns" containerID="cri-o://93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39" gracePeriod=10 Feb 16 21:58:13 crc kubenswrapper[4777]: I0216 21:58:13.592411 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" event={"ID":"cc22189e-82c6-4078-ac5b-ce27b2e891fd","Type":"ContainerDied","Data":"13f202891eddbe42db5953a9a4abdfb38766d69f3225c1265e219f95c90a09e4"} Feb 16 21:58:13 crc kubenswrapper[4777]: I0216 21:58:13.593164 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" event={"ID":"cc22189e-82c6-4078-ac5b-ce27b2e891fd","Type":"ContainerStarted","Data":"29cdb5b37e4798fc6301318264e2bc4c9448507daf167023869ca84d0bfea888"} Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.010992 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.039467 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5b86\" (UniqueName: \"kubernetes.io/projected/5dde5180-41b7-4ccd-b5bf-d144b205d163-kube-api-access-l5b86\") pod \"5dde5180-41b7-4ccd-b5bf-d144b205d163\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.039532 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-combined-ca-bundle\") pod \"5dde5180-41b7-4ccd-b5bf-d144b205d163\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.039585 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-config-data\") pod \"5dde5180-41b7-4ccd-b5bf-d144b205d163\" (UID: \"5dde5180-41b7-4ccd-b5bf-d144b205d163\") " Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.044561 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dde5180-41b7-4ccd-b5bf-d144b205d163-kube-api-access-l5b86" (OuterVolumeSpecName: "kube-api-access-l5b86") pod "5dde5180-41b7-4ccd-b5bf-d144b205d163" (UID: "5dde5180-41b7-4ccd-b5bf-d144b205d163"). InnerVolumeSpecName "kube-api-access-l5b86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.067123 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dde5180-41b7-4ccd-b5bf-d144b205d163" (UID: "5dde5180-41b7-4ccd-b5bf-d144b205d163"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.078366 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.089032 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-config-data" (OuterVolumeSpecName: "config-data") pod "5dde5180-41b7-4ccd-b5bf-d144b205d163" (UID: "5dde5180-41b7-4ccd-b5bf-d144b205d163"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.140855 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsxwp\" (UniqueName: \"kubernetes.io/projected/6f6d77d8-729a-43a9-a423-be6be2055f86-kube-api-access-tsxwp\") pod \"6f6d77d8-729a-43a9-a423-be6be2055f86\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.140934 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-config\") pod \"6f6d77d8-729a-43a9-a423-be6be2055f86\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.140963 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-sb\") pod \"6f6d77d8-729a-43a9-a423-be6be2055f86\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.140998 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-nb\") pod 
\"6f6d77d8-729a-43a9-a423-be6be2055f86\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.141083 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-dns-svc\") pod \"6f6d77d8-729a-43a9-a423-be6be2055f86\" (UID: \"6f6d77d8-729a-43a9-a423-be6be2055f86\") " Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.141450 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5b86\" (UniqueName: \"kubernetes.io/projected/5dde5180-41b7-4ccd-b5bf-d144b205d163-kube-api-access-l5b86\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.141471 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.141480 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dde5180-41b7-4ccd-b5bf-d144b205d163-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.144065 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f6d77d8-729a-43a9-a423-be6be2055f86-kube-api-access-tsxwp" (OuterVolumeSpecName: "kube-api-access-tsxwp") pod "6f6d77d8-729a-43a9-a423-be6be2055f86" (UID: "6f6d77d8-729a-43a9-a423-be6be2055f86"). InnerVolumeSpecName "kube-api-access-tsxwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.184744 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6f6d77d8-729a-43a9-a423-be6be2055f86" (UID: "6f6d77d8-729a-43a9-a423-be6be2055f86"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.184985 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6f6d77d8-729a-43a9-a423-be6be2055f86" (UID: "6f6d77d8-729a-43a9-a423-be6be2055f86"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.186896 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6f6d77d8-729a-43a9-a423-be6be2055f86" (UID: "6f6d77d8-729a-43a9-a423-be6be2055f86"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.188790 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-config" (OuterVolumeSpecName: "config") pod "6f6d77d8-729a-43a9-a423-be6be2055f86" (UID: "6f6d77d8-729a-43a9-a423-be6be2055f86"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.242795 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsxwp\" (UniqueName: \"kubernetes.io/projected/6f6d77d8-729a-43a9-a423-be6be2055f86-kube-api-access-tsxwp\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.242830 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.242840 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.242848 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.242858 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f6d77d8-729a-43a9-a423-be6be2055f86-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.611969 4777 generic.go:334] "Generic (PLEG): container finished" podID="6f6d77d8-729a-43a9-a423-be6be2055f86" containerID="93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39" exitCode=0 Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.612078 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" event={"ID":"6f6d77d8-729a-43a9-a423-be6be2055f86","Type":"ContainerDied","Data":"93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39"} Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 
21:58:14.612122 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" event={"ID":"6f6d77d8-729a-43a9-a423-be6be2055f86","Type":"ContainerDied","Data":"c2c92841c7dd03a00da3f2834960bd60c4fccd96aed61eca4b4583a2845aa505"} Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.612152 4777 scope.go:117] "RemoveContainer" containerID="93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.612335 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-x5pvr" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.622126 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" event={"ID":"cc22189e-82c6-4078-ac5b-ce27b2e891fd","Type":"ContainerStarted","Data":"65228178cd903289646d3f1f31e24fc69e55721848afd119341bf3705d539e19"} Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.622242 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.625847 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h4sfw" event={"ID":"5dde5180-41b7-4ccd-b5bf-d144b205d163","Type":"ContainerDied","Data":"3e3782defba4411e31f1b5f7d5df5e9fb05b740107240e4f1cae8049c65ff34f"} Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.625883 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e3782defba4411e31f1b5f7d5df5e9fb05b740107240e4f1cae8049c65ff34f" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.625946 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h4sfw" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.647830 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" podStartSLOduration=3.647809329 podStartE2EDuration="3.647809329s" podCreationTimestamp="2026-02-16 21:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:14.638764716 +0000 UTC m=+1215.221265858" watchObservedRunningTime="2026-02-16 21:58:14.647809329 +0000 UTC m=+1215.230310441" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.726034 4777 scope.go:117] "RemoveContainer" containerID="b4a6079c50faf50e2354d6e019120f2ebb7c7862b0660bd6ded05cc480908245" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.752056 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-x5pvr"] Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.759494 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-x5pvr"] Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.764254 4777 scope.go:117] "RemoveContainer" containerID="93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39" Feb 16 21:58:14 crc kubenswrapper[4777]: E0216 21:58:14.764849 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39\": container with ID starting with 93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39 not found: ID does not exist" containerID="93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.764912 4777 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39"} err="failed to get container status \"93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39\": rpc error: code = NotFound desc = could not find container \"93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39\": container with ID starting with 93291aecd60aac187bc5c07b731ce6cdde02265f50b996276fa0fce6569b1f39 not found: ID does not exist" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.764975 4777 scope.go:117] "RemoveContainer" containerID="b4a6079c50faf50e2354d6e019120f2ebb7c7862b0660bd6ded05cc480908245" Feb 16 21:58:14 crc kubenswrapper[4777]: E0216 21:58:14.765339 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4a6079c50faf50e2354d6e019120f2ebb7c7862b0660bd6ded05cc480908245\": container with ID starting with b4a6079c50faf50e2354d6e019120f2ebb7c7862b0660bd6ded05cc480908245 not found: ID does not exist" containerID="b4a6079c50faf50e2354d6e019120f2ebb7c7862b0660bd6ded05cc480908245" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.765374 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4a6079c50faf50e2354d6e019120f2ebb7c7862b0660bd6ded05cc480908245"} err="failed to get container status \"b4a6079c50faf50e2354d6e019120f2ebb7c7862b0660bd6ded05cc480908245\": rpc error: code = NotFound desc = could not find container \"b4a6079c50faf50e2354d6e019120f2ebb7c7862b0660bd6ded05cc480908245\": container with ID starting with b4a6079c50faf50e2354d6e019120f2ebb7c7862b0660bd6ded05cc480908245 not found: ID does not exist" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.922796 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6vmpp"] Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.933663 4777 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-bootstrap-csnks"] Feb 16 21:58:14 crc kubenswrapper[4777]: E0216 21:58:14.934432 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6d77d8-729a-43a9-a423-be6be2055f86" containerName="dnsmasq-dns" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.934517 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6d77d8-729a-43a9-a423-be6be2055f86" containerName="dnsmasq-dns" Feb 16 21:58:14 crc kubenswrapper[4777]: E0216 21:58:14.934597 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6d77d8-729a-43a9-a423-be6be2055f86" containerName="init" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.934663 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6d77d8-729a-43a9-a423-be6be2055f86" containerName="init" Feb 16 21:58:14 crc kubenswrapper[4777]: E0216 21:58:14.934797 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dde5180-41b7-4ccd-b5bf-d144b205d163" containerName="keystone-db-sync" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.934871 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dde5180-41b7-4ccd-b5bf-d144b205d163" containerName="keystone-db-sync" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.935154 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dde5180-41b7-4ccd-b5bf-d144b205d163" containerName="keystone-db-sync" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.935237 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f6d77d8-729a-43a9-a423-be6be2055f86" containerName="dnsmasq-dns" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.936177 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.941859 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.942106 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cgg5d" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.942619 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.942749 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.943207 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.948683 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-csnks"] Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.958074 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-fernet-keys\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.958147 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-combined-ca-bundle\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.958178 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-config-data\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.958201 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xmlm\" (UniqueName: \"kubernetes.io/projected/802c1f49-fcd4-4208-94e7-b065ccaaba10-kube-api-access-6xmlm\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.958224 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-credential-keys\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.958246 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-scripts\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.968734 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-chnz7"] Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.971001 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:14 crc kubenswrapper[4777]: I0216 21:58:14.988955 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-chnz7"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.062849 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-combined-ca-bundle\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.063512 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-config-data\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.063549 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xmlm\" (UniqueName: \"kubernetes.io/projected/802c1f49-fcd4-4208-94e7-b065ccaaba10-kube-api-access-6xmlm\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.063577 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-credential-keys\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.063599 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-scripts\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.063631 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.063660 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-config\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.063688 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkkxx\" (UniqueName: \"kubernetes.io/projected/e10fb401-c264-4717-b647-c63155d05bf2-kube-api-access-jkkxx\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.063753 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-fernet-keys\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.063779 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-svc\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.063796 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.063823 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.069809 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-combined-ca-bundle\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.070176 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-credential-keys\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.070389 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-scripts\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.077942 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-config-data\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.078410 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-fernet-keys\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.110966 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xmlm\" (UniqueName: \"kubernetes.io/projected/802c1f49-fcd4-4208-94e7-b065ccaaba10-kube-api-access-6xmlm\") pod \"keystone-bootstrap-csnks\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.165552 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-config\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.165607 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkkxx\" (UniqueName: \"kubernetes.io/projected/e10fb401-c264-4717-b647-c63155d05bf2-kube-api-access-jkkxx\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: 
\"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.165663 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-svc\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.165681 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.165708 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.165791 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.166504 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " 
pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.167036 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-config\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.167752 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-svc\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.168365 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.172380 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.190359 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkkxx\" (UniqueName: \"kubernetes.io/projected/e10fb401-c264-4717-b647-c63155d05bf2-kube-api-access-jkkxx\") pod \"dnsmasq-dns-847c4cc679-chnz7\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 
21:58:15.255010 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-z9tj6"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.256167 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.269563 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.270318 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nbmqh" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.270446 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.270899 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.278919 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-z9tj6"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.307897 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.371698 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-scripts\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.371822 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-combined-ca-bundle\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.371871 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-config-data\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.371903 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg4k2\" (UniqueName: \"kubernetes.io/projected/626f6429-977f-4c1f-b055-3502cb530645-kube-api-access-zg4k2\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.371933 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-db-sync-config-data\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " 
pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.371948 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/626f6429-977f-4c1f-b055-3502cb530645-etc-machine-id\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.390803 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-pl2w2"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.391930 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.405309 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-4x94c"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.410464 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.423763 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-p4v8h"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.424947 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.427288 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.427604 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2xgvp" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.427835 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-h9m97" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.427995 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.428028 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.428136 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.432096 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.432225 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7v7qn" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.436063 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.463989 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pl2w2"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.469112 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.476490 
4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-db-sync-config-data\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.476524 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/626f6429-977f-4c1f-b055-3502cb530645-etc-machine-id\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.476561 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-scripts\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.476639 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-combined-ca-bundle\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.476683 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-config-data\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.476711 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg4k2\" (UniqueName: 
\"kubernetes.io/projected/626f6429-977f-4c1f-b055-3502cb530645-kube-api-access-zg4k2\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.477008 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/626f6429-977f-4c1f-b055-3502cb530645-etc-machine-id\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.481386 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-db-sync-config-data\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.494160 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.494532 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-4x94c"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.528232 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-scripts\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.529997 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-combined-ca-bundle\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" 
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.534269 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg4k2\" (UniqueName: \"kubernetes.io/projected/626f6429-977f-4c1f-b055-3502cb530645-kube-api-access-zg4k2\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.534317 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p4v8h"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.535891 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-config-data\") pod \"cinder-db-sync-z9tj6\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.581571 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-db-sync-config-data\") pod \"barbican-db-sync-pl2w2\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.581624 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjzn8\" (UniqueName: \"kubernetes.io/projected/6681c372-76f5-4242-a533-0db4f1e711c8-kube-api-access-wjzn8\") pod \"neutron-db-sync-p4v8h\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.581650 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4g7t\" (UniqueName: \"kubernetes.io/projected/5f37d0cb-2453-4e2f-96de-de72db42d690-kube-api-access-b4g7t\") pod 
\"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.581739 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-config\") pod \"neutron-db-sync-p4v8h\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.581849 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcdh6\" (UniqueName: \"kubernetes.io/projected/4a4d7671-062a-4647-9a54-6933f7cc3a4d-kube-api-access-fcdh6\") pod \"barbican-db-sync-pl2w2\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.581871 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5f37d0cb-2453-4e2f-96de-de72db42d690-certs\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.581910 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f37d0cb-2453-4e2f-96de-de72db42d690-combined-ca-bundle\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.581955 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f37d0cb-2453-4e2f-96de-de72db42d690-scripts\") pod \"cloudkitty-db-sync-4x94c\" (UID: 
\"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.581978 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-combined-ca-bundle\") pod \"neutron-db-sync-p4v8h\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.582009 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f37d0cb-2453-4e2f-96de-de72db42d690-config-data\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.582037 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-combined-ca-bundle\") pod \"barbican-db-sync-pl2w2\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.604774 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.612380 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-chnz7"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.625282 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-pgthg"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.626479 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pgthg" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.628951 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.632190 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qg6qv" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.642209 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.683120 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.709028 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.711806 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-config\") pod \"neutron-db-sync-p4v8h\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.711920 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcdh6\" (UniqueName: \"kubernetes.io/projected/4a4d7671-062a-4647-9a54-6933f7cc3a4d-kube-api-access-fcdh6\") pod \"barbican-db-sync-pl2w2\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.711954 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5f37d0cb-2453-4e2f-96de-de72db42d690-certs\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 
16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.711990 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f37d0cb-2453-4e2f-96de-de72db42d690-combined-ca-bundle\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.712023 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f37d0cb-2453-4e2f-96de-de72db42d690-scripts\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.712052 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-combined-ca-bundle\") pod \"neutron-db-sync-p4v8h\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.712078 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f37d0cb-2453-4e2f-96de-de72db42d690-config-data\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.712108 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-combined-ca-bundle\") pod \"barbican-db-sync-pl2w2\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.712172 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-db-sync-config-data\") pod \"barbican-db-sync-pl2w2\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.712195 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjzn8\" (UniqueName: \"kubernetes.io/projected/6681c372-76f5-4242-a533-0db4f1e711c8-kube-api-access-wjzn8\") pod \"neutron-db-sync-p4v8h\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.712214 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4g7t\" (UniqueName: \"kubernetes.io/projected/5f37d0cb-2453-4e2f-96de-de72db42d690-kube-api-access-b4g7t\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.712272 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.718736 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.729063 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-combined-ca-bundle\") pod \"barbican-db-sync-pl2w2\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.732902 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5f37d0cb-2453-4e2f-96de-de72db42d690-scripts\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.736439 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f37d0cb-2453-4e2f-96de-de72db42d690-config-data\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.739919 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-config\") pod \"neutron-db-sync-p4v8h\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.740492 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-db-sync-config-data\") pod \"barbican-db-sync-pl2w2\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.743811 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.752238 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f37d0cb-2453-4e2f-96de-de72db42d690-combined-ca-bundle\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.753083 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-b4g7t\" (UniqueName: \"kubernetes.io/projected/5f37d0cb-2453-4e2f-96de-de72db42d690-kube-api-access-b4g7t\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.753493 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-combined-ca-bundle\") pod \"neutron-db-sync-p4v8h\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.761284 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/5f37d0cb-2453-4e2f-96de-de72db42d690-certs\") pod \"cloudkitty-db-sync-4x94c\" (UID: \"5f37d0cb-2453-4e2f-96de-de72db42d690\") " pod="openstack/cloudkitty-db-sync-4x94c" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.764032 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjzn8\" (UniqueName: \"kubernetes.io/projected/6681c372-76f5-4242-a533-0db4f1e711c8-kube-api-access-wjzn8\") pod \"neutron-db-sync-p4v8h\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.773067 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcdh6\" (UniqueName: \"kubernetes.io/projected/4a4d7671-062a-4647-9a54-6933f7cc3a4d-kube-api-access-fcdh6\") pod \"barbican-db-sync-pl2w2\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.808107 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.817544 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8s8j\" (UniqueName: \"kubernetes.io/projected/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-kube-api-access-z8s8j\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.817615 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.817638 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-combined-ca-bundle\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.817656 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-log-httpd\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.817774 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-logs\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg" Feb 16 21:58:15 crc 
kubenswrapper[4777]: I0216 21:58:15.817908 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-run-httpd\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.817962 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-scripts\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.818012 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.818048 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6c9h\" (UniqueName: \"kubernetes.io/projected/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-kube-api-access-b6c9h\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.820847 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-config-data\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg" Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.821313 4777 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-scripts\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.821344 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-config-data\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.833258 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pgthg"]
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.874818 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.882057 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jg86v"]
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.888283 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.895281 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jg86v"]
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.924634 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6c9h\" (UniqueName: \"kubernetes.io/projected/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-kube-api-access-b6c9h\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.924747 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-config-data\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.924817 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-scripts\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.924847 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-config-data\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.932070 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-config-data\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.933928 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8s8j\" (UniqueName: \"kubernetes.io/projected/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-kube-api-access-z8s8j\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.938496 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.938568 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-combined-ca-bundle\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.938607 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-log-httpd\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.938754 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-logs\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.943608 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-scripts\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.944537 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-logs\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.945103 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-combined-ca-bundle\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.945988 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-run-httpd\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.946120 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-scripts\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.946241 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.947035 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-log-httpd\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.948376 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-config-data\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.955488 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.957293 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-scripts\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.963538 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.964294 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-run-httpd\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.964906 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6c9h\" (UniqueName: \"kubernetes.io/projected/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-kube-api-access-b6c9h\") pod \"placement-db-sync-pgthg\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " pod="openstack/placement-db-sync-pgthg"
Feb 16 21:58:15 crc kubenswrapper[4777]: I0216 21:58:15.974408 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8s8j\" (UniqueName: \"kubernetes.io/projected/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-kube-api-access-z8s8j\") pod \"ceilometer-0\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " pod="openstack/ceilometer-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.031202 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pl2w2"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.048961 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.049069 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-config\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.049120 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.049157 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.049181 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq258\" (UniqueName: \"kubernetes.io/projected/ac680854-48f9-420b-8174-1a1766732f1a-kube-api-access-jq258\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.049242 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.060581 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-4x94c"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.152550 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.152626 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.152674 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-config\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.152706 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.152800 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.152836 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq258\" (UniqueName: \"kubernetes.io/projected/ac680854-48f9-420b-8174-1a1766732f1a-kube-api-access-jq258\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.153588 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.153771 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.153830 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.153927 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.154172 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-config\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.183179 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pgthg"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.192619 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq258\" (UniqueName: \"kubernetes.io/projected/ac680854-48f9-420b-8174-1a1766732f1a-kube-api-access-jq258\") pod \"dnsmasq-dns-785d8bcb8c-jg86v\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.209465 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f6d77d8-729a-43a9-a423-be6be2055f86" path="/var/lib/kubelet/pods/6f6d77d8-729a-43a9-a423-be6be2055f86/volumes"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.210220 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.211666 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.215259 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.219247 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cbq8q"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.219443 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.219577 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.219767 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.225940 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.229162 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.232372 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.232587 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.245309 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.251684 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.255609 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.301635 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-chnz7"]
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358327 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-config-data\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358403 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358426 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-config-data\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358445 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358468 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358494 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358525 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct7w2\" (UniqueName: \"kubernetes.io/projected/164a2b79-be6e-4867-a67a-e5611b831d38-kube-api-access-ct7w2\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358557 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-logs\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358571 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358591 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358610 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358632 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-logs\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358655 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-scripts\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358674 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2pt7\" (UniqueName: \"kubernetes.io/projected/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-kube-api-access-q2pt7\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358696 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.358748 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-scripts\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.402883 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.414685 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-csnks"]
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.428909 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-z9tj6"]
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.461882 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-scripts\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.461946 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-config-data\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.461990 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462009 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-config-data\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462042 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462068 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462089 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462123 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct7w2\" (UniqueName: \"kubernetes.io/projected/164a2b79-be6e-4867-a67a-e5611b831d38-kube-api-access-ct7w2\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462155 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-logs\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462171 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462191 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462209 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462234 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-logs\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462256 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-scripts\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462280 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2pt7\" (UniqueName: \"kubernetes.io/projected/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-kube-api-access-q2pt7\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.462304 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.465386 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-logs\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.466283 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.466341 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-logs\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.466622 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.467092 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.467119 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4a776c3602f9f8f9eee625c68304a3bb11e095b4c9719ede519a11464d274ec4/globalmount\"" pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.467093 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.467179 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb51e62c662de87643413e62600db5d03a822462f1c1c4cd14c3bd046c1343d0/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.471292 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.473063 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-config-data\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.478420 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-scripts\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.479823 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct7w2\" (UniqueName: \"kubernetes.io/projected/164a2b79-be6e-4867-a67a-e5611b831d38-kube-api-access-ct7w2\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.483386 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.485188 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.487589 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-scripts\") pod
\"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.489638 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.496878 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-config-data\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.504379 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2pt7\" (UniqueName: \"kubernetes.io/projected/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-kube-api-access-q2pt7\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.533039 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.545207 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.564562 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.576596 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.678657 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p4v8h"] Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.732089 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-z9tj6" event={"ID":"626f6429-977f-4c1f-b055-3502cb530645","Type":"ContainerStarted","Data":"7c703b74e9f817e86c6ec11e94c6e08afd9368aa409198183eadfb7184ef2566"} Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.746172 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p4v8h" event={"ID":"6681c372-76f5-4242-a533-0db4f1e711c8","Type":"ContainerStarted","Data":"68a79603bc9b9ea4ab1656e7102892acb25df42c7dd73acbbe25ac7b577b9275"} Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.765918 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-csnks" event={"ID":"802c1f49-fcd4-4208-94e7-b065ccaaba10","Type":"ContainerStarted","Data":"9cb1c546ea7f270b7342b020a4f0a99e918a672d1843f59c57b60c8c3d05c761"} Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.790782 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-chnz7" event={"ID":"e10fb401-c264-4717-b647-c63155d05bf2","Type":"ContainerStarted","Data":"16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5"} 
Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.790934 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-chnz7" event={"ID":"e10fb401-c264-4717-b647-c63155d05bf2","Type":"ContainerStarted","Data":"355dae2d81e7dea5c5d377fd851b8b15a52687bf2860d44aa1c96e568877c266"} Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.791003 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" podUID="cc22189e-82c6-4078-ac5b-ce27b2e891fd" containerName="dnsmasq-dns" containerID="cri-o://65228178cd903289646d3f1f31e24fc69e55721848afd119341bf3705d539e19" gracePeriod=10 Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.875909 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pl2w2"] Feb 16 21:58:16 crc kubenswrapper[4777]: I0216 21:58:16.884185 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-4x94c"] Feb 16 21:58:17 crc kubenswrapper[4777]: E0216 21:58:17.099634 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 21:58:17 crc kubenswrapper[4777]: E0216 21:58:17.099936 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 21:58:17 crc kubenswrapper[4777]: E0216 21:58:17.100067 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessPr
obe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError" Feb 16 21:58:17 crc kubenswrapper[4777]: E0216 21:58:17.101783 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.350007 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pgthg"] Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.581635 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.603428 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jg86v"] Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.626211 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.714456 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-config\") pod \"e10fb401-c264-4717-b647-c63155d05bf2\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.714496 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-sb\") pod \"e10fb401-c264-4717-b647-c63155d05bf2\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.714532 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-swift-storage-0\") pod \"e10fb401-c264-4717-b647-c63155d05bf2\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.714645 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-nb\") pod \"e10fb401-c264-4717-b647-c63155d05bf2\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.714768 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-svc\") pod \"e10fb401-c264-4717-b647-c63155d05bf2\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.714794 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkkxx\" (UniqueName: \"kubernetes.io/projected/e10fb401-c264-4717-b647-c63155d05bf2-kube-api-access-jkkxx\") pod \"e10fb401-c264-4717-b647-c63155d05bf2\" (UID: \"e10fb401-c264-4717-b647-c63155d05bf2\") " Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.788256 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e10fb401-c264-4717-b647-c63155d05bf2-kube-api-access-jkkxx" (OuterVolumeSpecName: "kube-api-access-jkkxx") pod "e10fb401-c264-4717-b647-c63155d05bf2" (UID: "e10fb401-c264-4717-b647-c63155d05bf2"). InnerVolumeSpecName "kube-api-access-jkkxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.798266 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e10fb401-c264-4717-b647-c63155d05bf2" (UID: "e10fb401-c264-4717-b647-c63155d05bf2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.818397 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkkxx\" (UniqueName: \"kubernetes.io/projected/e10fb401-c264-4717-b647-c63155d05bf2-kube-api-access-jkkxx\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.818437 4777 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.830782 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-config" (OuterVolumeSpecName: "config") pod "e10fb401-c264-4717-b647-c63155d05bf2" (UID: "e10fb401-c264-4717-b647-c63155d05bf2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.831115 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-4x94c" event={"ID":"5f37d0cb-2453-4e2f-96de-de72db42d690","Type":"ContainerStarted","Data":"4cebfc37e99561ab3c276d273afcfa3979a25bdfce071fde92e92bb20e926622"} Feb 16 21:58:17 crc kubenswrapper[4777]: E0216 21:58:17.834198 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.834489 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pl2w2" 
event={"ID":"4a4d7671-062a-4647-9a54-6933f7cc3a4d","Type":"ContainerStarted","Data":"8398bc254479e0c3eae0e2cad34da419d9bfc19c26e641f79d274367ca625bf0"} Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.851563 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e10fb401-c264-4717-b647-c63155d05bf2" (UID: "e10fb401-c264-4717-b647-c63155d05bf2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.852020 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e10fb401-c264-4717-b647-c63155d05bf2" (UID: "e10fb401-c264-4717-b647-c63155d05bf2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.867512 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pgthg" event={"ID":"332f9253-5c7a-4cc9-aead-9adc1fe86b2e","Type":"ContainerStarted","Data":"a16edc283f11001d268d46d49dd8056017f0754af4618977e086747b51fd6663"} Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.880569 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p4v8h" event={"ID":"6681c372-76f5-4242-a533-0db4f1e711c8","Type":"ContainerStarted","Data":"d0686f70a8a7c62232275422cf5ebd883e6d555c19b952c8f5f33f9dfab3e888"} Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.925844 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.926147 4777 reconciler_common.go:293] 
"Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.926161 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.926908 4777 generic.go:334] "Generic (PLEG): container finished" podID="cc22189e-82c6-4078-ac5b-ce27b2e891fd" containerID="65228178cd903289646d3f1f31e24fc69e55721848afd119341bf3705d539e19" exitCode=0 Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.926976 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" event={"ID":"cc22189e-82c6-4078-ac5b-ce27b2e891fd","Type":"ContainerDied","Data":"65228178cd903289646d3f1f31e24fc69e55721848afd119341bf3705d539e19"} Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.927000 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" event={"ID":"cc22189e-82c6-4078-ac5b-ce27b2e891fd","Type":"ContainerDied","Data":"29cdb5b37e4798fc6301318264e2bc4c9448507daf167023869ca84d0bfea888"} Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.927011 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29cdb5b37e4798fc6301318264e2bc4c9448507daf167023869ca84d0bfea888" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.973094 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-csnks" event={"ID":"802c1f49-fcd4-4208-94e7-b065ccaaba10","Type":"ContainerStarted","Data":"8235b8d1d9b22da04310169790fa837ce57f1ffee37a8026b1f976e0f1a0cb0c"} Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.977087 4777 generic.go:334] "Generic (PLEG): container finished" 
podID="e10fb401-c264-4717-b647-c63155d05bf2" containerID="16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5" exitCode=0 Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.977140 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-chnz7" event={"ID":"e10fb401-c264-4717-b647-c63155d05bf2","Type":"ContainerDied","Data":"16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5"} Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.977163 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-chnz7" event={"ID":"e10fb401-c264-4717-b647-c63155d05bf2","Type":"ContainerDied","Data":"355dae2d81e7dea5c5d377fd851b8b15a52687bf2860d44aa1c96e568877c266"} Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.977179 4777 scope.go:117] "RemoveContainer" containerID="16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.977273 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-chnz7" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.986349 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v" event={"ID":"ac680854-48f9-420b-8174-1a1766732f1a","Type":"ContainerStarted","Data":"0f94f632a2c520daacc27f798100652f07d19803dd243eeaaea10910304f6078"} Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.993015 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e10fb401-c264-4717-b647-c63155d05bf2" (UID: "e10fb401-c264-4717-b647-c63155d05bf2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:17 crc kubenswrapper[4777]: I0216 21:58:17.998102 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-p4v8h" podStartSLOduration=2.998085692 podStartE2EDuration="2.998085692s" podCreationTimestamp="2026-02-16 21:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:17.929606014 +0000 UTC m=+1218.512107116" watchObservedRunningTime="2026-02-16 21:58:17.998085692 +0000 UTC m=+1218.580586794" Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.034050 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e10fb401-c264-4717-b647-c63155d05bf2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.034698 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-csnks" podStartSLOduration=4.034687677 podStartE2EDuration="4.034687677s" podCreationTimestamp="2026-02-16 21:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:18.002078854 +0000 UTC m=+1218.584579956" watchObservedRunningTime="2026-02-16 21:58:18.034687677 +0000 UTC m=+1218.617188779" Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.049437 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.159147 4777 scope.go:117] "RemoveContainer" containerID="16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5" Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.159382 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp" Feb 16 21:58:18 crc kubenswrapper[4777]: E0216 21:58:18.160079 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5\": container with ID starting with 16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5 not found: ID does not exist" containerID="16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5" Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.160121 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5"} err="failed to get container status \"16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5\": rpc error: code = NotFound desc = could not find container \"16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5\": container with ID starting with 16dae6e7786c247cd9a9d5f66e36b68f0eec22149d474b3dab4651d7c56cc9b5 not found: ID does not exist" Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.247001 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-nb\") pod \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.247082 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-swift-storage-0\") pod \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.247132 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-config\") pod \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.247254 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-svc\") pod \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.247330 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ql6g\" (UniqueName: \"kubernetes.io/projected/cc22189e-82c6-4078-ac5b-ce27b2e891fd-kube-api-access-5ql6g\") pod \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.247508 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-sb\") pod \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\" (UID: \"cc22189e-82c6-4078-ac5b-ce27b2e891fd\") " Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.273438 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc22189e-82c6-4078-ac5b-ce27b2e891fd-kube-api-access-5ql6g" (OuterVolumeSpecName: "kube-api-access-5ql6g") pod "cc22189e-82c6-4078-ac5b-ce27b2e891fd" (UID: "cc22189e-82c6-4078-ac5b-ce27b2e891fd"). InnerVolumeSpecName "kube-api-access-5ql6g". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.329802 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-chnz7"]
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.334295 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc22189e-82c6-4078-ac5b-ce27b2e891fd" (UID: "cc22189e-82c6-4078-ac5b-ce27b2e891fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.349197 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc22189e-82c6-4078-ac5b-ce27b2e891fd" (UID: "cc22189e-82c6-4078-ac5b-ce27b2e891fd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.350476 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.350495 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.350506 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ql6g\" (UniqueName: \"kubernetes.io/projected/cc22189e-82c6-4078-ac5b-ce27b2e891fd-kube-api-access-5ql6g\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.357935 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-chnz7"]
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.359225 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cc22189e-82c6-4078-ac5b-ce27b2e891fd" (UID: "cc22189e-82c6-4078-ac5b-ce27b2e891fd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.360704 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc22189e-82c6-4078-ac5b-ce27b2e891fd" (UID: "cc22189e-82c6-4078-ac5b-ce27b2e891fd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.398203 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-config" (OuterVolumeSpecName: "config") pod "cc22189e-82c6-4078-ac5b-ce27b2e891fd" (UID: "cc22189e-82c6-4078-ac5b-ce27b2e891fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.453738 4777 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.453769 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-config\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.453778 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc22189e-82c6-4078-ac5b-ce27b2e891fd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.508189 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.531583 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.610730 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 16 21:58:18 crc kubenswrapper[4777]: I0216 21:58:18.796434 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 16 21:58:19 crc kubenswrapper[4777]: I0216 21:58:19.010768 4777 generic.go:334] "Generic (PLEG): container finished" podID="ac680854-48f9-420b-8174-1a1766732f1a" containerID="b19363ca8e15a52aa9e1a6160aca1d7abdd7fd3f0a5879f3a634940e2f4de8f2" exitCode=0
Feb 16 21:58:19 crc kubenswrapper[4777]: I0216 21:58:19.010842 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v" event={"ID":"ac680854-48f9-420b-8174-1a1766732f1a","Type":"ContainerDied","Data":"b19363ca8e15a52aa9e1a6160aca1d7abdd7fd3f0a5879f3a634940e2f4de8f2"}
Feb 16 21:58:19 crc kubenswrapper[4777]: I0216 21:58:19.022134 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56","Type":"ContainerStarted","Data":"453fce8f4b05bf10c4eedaae590d937b40e45e5609e102db6ff9eae1ef01a333"}
Feb 16 21:58:19 crc kubenswrapper[4777]: I0216 21:58:19.054048 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"724fcbf8-c3b0-4d69-bf86-f2691e7a8947","Type":"ContainerStarted","Data":"edb15ff0ccb69174e37b5c4e9ff1e65dc74364c54537f2661588b5f40ebb10f4"}
Feb 16 21:58:19 crc kubenswrapper[4777]: I0216 21:58:19.062926 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"164a2b79-be6e-4867-a67a-e5611b831d38","Type":"ContainerStarted","Data":"ae0e13926a06ceb2a57c6c974466ffac3088ae4a1dd1019610e671c739c92737"}
Feb 16 21:58:19 crc kubenswrapper[4777]: I0216 21:58:19.076140 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-6vmpp"
Feb 16 21:58:19 crc kubenswrapper[4777]: E0216 21:58:19.097781 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 21:58:19 crc kubenswrapper[4777]: I0216 21:58:19.177241 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6vmpp"]
Feb 16 21:58:19 crc kubenswrapper[4777]: I0216 21:58:19.191357 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6vmpp"]
Feb 16 21:58:20 crc kubenswrapper[4777]: I0216 21:58:20.091836 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"724fcbf8-c3b0-4d69-bf86-f2691e7a8947","Type":"ContainerStarted","Data":"a2724b813226c186fc4b5a5e1ef36486117c3469645023e2ad8a4eb2cdf173af"}
Feb 16 21:58:20 crc kubenswrapper[4777]: I0216 21:58:20.095405 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"164a2b79-be6e-4867-a67a-e5611b831d38","Type":"ContainerStarted","Data":"b6fe8cf11222ba136db083b8853acdf1dab43bbb9d82861f49c1f92792d96fc4"}
Feb 16 21:58:20 crc kubenswrapper[4777]: I0216 21:58:20.099813 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v" event={"ID":"ac680854-48f9-420b-8174-1a1766732f1a","Type":"ContainerStarted","Data":"0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985"}
Feb 16 21:58:20 crc kubenswrapper[4777]: I0216 21:58:20.100083 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:20 crc kubenswrapper[4777]: I0216 21:58:20.136219 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v" podStartSLOduration=5.136198724 podStartE2EDuration="5.136198724s" podCreationTimestamp="2026-02-16 21:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:20.126586454 +0000 UTC m=+1220.709087546" watchObservedRunningTime="2026-02-16 21:58:20.136198724 +0000 UTC m=+1220.718699826"
Feb 16 21:58:20 crc kubenswrapper[4777]: I0216 21:58:20.217169 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc22189e-82c6-4078-ac5b-ce27b2e891fd" path="/var/lib/kubelet/pods/cc22189e-82c6-4078-ac5b-ce27b2e891fd/volumes"
Feb 16 21:58:20 crc kubenswrapper[4777]: I0216 21:58:20.218268 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e10fb401-c264-4717-b647-c63155d05bf2" path="/var/lib/kubelet/pods/e10fb401-c264-4717-b647-c63155d05bf2/volumes"
Feb 16 21:58:21 crc kubenswrapper[4777]: I0216 21:58:21.110471 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"164a2b79-be6e-4867-a67a-e5611b831d38","Type":"ContainerStarted","Data":"0d730477226bc83c6f51b8ead5d7541621d20fe7d2f91664f9848fcd718c3f15"}
Feb 16 21:58:21 crc kubenswrapper[4777]: I0216 21:58:21.110586 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="164a2b79-be6e-4867-a67a-e5611b831d38" containerName="glance-log" containerID="cri-o://b6fe8cf11222ba136db083b8853acdf1dab43bbb9d82861f49c1f92792d96fc4" gracePeriod=30
Feb 16 21:58:21 crc kubenswrapper[4777]: I0216 21:58:21.110771 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="164a2b79-be6e-4867-a67a-e5611b831d38" containerName="glance-httpd" containerID="cri-o://0d730477226bc83c6f51b8ead5d7541621d20fe7d2f91664f9848fcd718c3f15" gracePeriod=30
Feb 16 21:58:21 crc kubenswrapper[4777]: I0216 21:58:21.126290 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="724fcbf8-c3b0-4d69-bf86-f2691e7a8947" containerName="glance-log" containerID="cri-o://a2724b813226c186fc4b5a5e1ef36486117c3469645023e2ad8a4eb2cdf173af" gracePeriod=30
Feb 16 21:58:21 crc kubenswrapper[4777]: I0216 21:58:21.126550 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"724fcbf8-c3b0-4d69-bf86-f2691e7a8947","Type":"ContainerStarted","Data":"3921ea88b637d72a568bd5d11b29c76e3f28ea72048d4109f8025384b96b53b6"}
Feb 16 21:58:21 crc kubenswrapper[4777]: I0216 21:58:21.126643 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="724fcbf8-c3b0-4d69-bf86-f2691e7a8947" containerName="glance-httpd" containerID="cri-o://3921ea88b637d72a568bd5d11b29c76e3f28ea72048d4109f8025384b96b53b6" gracePeriod=30
Feb 16 21:58:21 crc kubenswrapper[4777]: I0216 21:58:21.151401 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.151382556 podStartE2EDuration="6.151382556s" podCreationTimestamp="2026-02-16 21:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:21.130170132 +0000 UTC m=+1221.712671234" watchObservedRunningTime="2026-02-16 21:58:21.151382556 +0000 UTC m=+1221.733883658"
Feb 16 21:58:21 crc kubenswrapper[4777]: I0216 21:58:21.166648 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.166630683 podStartE2EDuration="6.166630683s" podCreationTimestamp="2026-02-16 21:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:21.156079588 +0000 UTC m=+1221.738580680" watchObservedRunningTime="2026-02-16 21:58:21.166630683 +0000 UTC m=+1221.749131785"
Feb 16 21:58:22 crc kubenswrapper[4777]: I0216 21:58:22.165438 4777 generic.go:334] "Generic (PLEG): container finished" podID="802c1f49-fcd4-4208-94e7-b065ccaaba10" containerID="8235b8d1d9b22da04310169790fa837ce57f1ffee37a8026b1f976e0f1a0cb0c" exitCode=0
Feb 16 21:58:22 crc kubenswrapper[4777]: I0216 21:58:22.165872 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-csnks" event={"ID":"802c1f49-fcd4-4208-94e7-b065ccaaba10","Type":"ContainerDied","Data":"8235b8d1d9b22da04310169790fa837ce57f1ffee37a8026b1f976e0f1a0cb0c"}
Feb 16 21:58:22 crc kubenswrapper[4777]: I0216 21:58:22.172933 4777 generic.go:334] "Generic (PLEG): container finished" podID="164a2b79-be6e-4867-a67a-e5611b831d38" containerID="0d730477226bc83c6f51b8ead5d7541621d20fe7d2f91664f9848fcd718c3f15" exitCode=0
Feb 16 21:58:22 crc kubenswrapper[4777]: I0216 21:58:22.172965 4777 generic.go:334] "Generic (PLEG): container finished" podID="164a2b79-be6e-4867-a67a-e5611b831d38" containerID="b6fe8cf11222ba136db083b8853acdf1dab43bbb9d82861f49c1f92792d96fc4" exitCode=143
Feb 16 21:58:22 crc kubenswrapper[4777]: I0216 21:58:22.173008 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"164a2b79-be6e-4867-a67a-e5611b831d38","Type":"ContainerDied","Data":"0d730477226bc83c6f51b8ead5d7541621d20fe7d2f91664f9848fcd718c3f15"}
Feb 16 21:58:22 crc kubenswrapper[4777]: I0216 21:58:22.173048 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"164a2b79-be6e-4867-a67a-e5611b831d38","Type":"ContainerDied","Data":"b6fe8cf11222ba136db083b8853acdf1dab43bbb9d82861f49c1f92792d96fc4"}
Feb 16 21:58:22 crc kubenswrapper[4777]: I0216 21:58:22.187911 4777 generic.go:334] "Generic (PLEG): container finished" podID="724fcbf8-c3b0-4d69-bf86-f2691e7a8947" containerID="3921ea88b637d72a568bd5d11b29c76e3f28ea72048d4109f8025384b96b53b6" exitCode=0
Feb 16 21:58:22 crc kubenswrapper[4777]: I0216 21:58:22.187942 4777 generic.go:334] "Generic (PLEG): container finished" podID="724fcbf8-c3b0-4d69-bf86-f2691e7a8947" containerID="a2724b813226c186fc4b5a5e1ef36486117c3469645023e2ad8a4eb2cdf173af" exitCode=143
Feb 16 21:58:22 crc kubenswrapper[4777]: I0216 21:58:22.213705 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"724fcbf8-c3b0-4d69-bf86-f2691e7a8947","Type":"ContainerDied","Data":"3921ea88b637d72a568bd5d11b29c76e3f28ea72048d4109f8025384b96b53b6"}
Feb 16 21:58:22 crc kubenswrapper[4777]: I0216 21:58:22.213762 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"724fcbf8-c3b0-4d69-bf86-f2691e7a8947","Type":"ContainerDied","Data":"a2724b813226c186fc4b5a5e1ef36486117c3469645023e2ad8a4eb2cdf173af"}
Feb 16 21:58:26 crc kubenswrapper[4777]: I0216 21:58:26.257675 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v"
Feb 16 21:58:26 crc kubenswrapper[4777]: I0216 21:58:26.323365 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx9j2"]
Feb 16 21:58:26 crc kubenswrapper[4777]: I0216 21:58:26.323584 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-rx9j2" podUID="9043c023-eae7-4cba-bcfe-3e44fa168c38" containerName="dnsmasq-dns" containerID="cri-o://e0381be60a85348ea300843122315b258d68348370597d9404770c1c589231b2" gracePeriod=10
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.244409 4777 generic.go:334] "Generic (PLEG): container finished" podID="9043c023-eae7-4cba-bcfe-3e44fa168c38" containerID="e0381be60a85348ea300843122315b258d68348370597d9404770c1c589231b2" exitCode=0
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.244471 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rx9j2" event={"ID":"9043c023-eae7-4cba-bcfe-3e44fa168c38","Type":"ContainerDied","Data":"e0381be60a85348ea300843122315b258d68348370597d9404770c1c589231b2"}
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.677613 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.791124 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-config-data\") pod \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") "
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.791345 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-scripts\") pod \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") "
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.791388 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-logs\") pod \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") "
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.791459 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-internal-tls-certs\") pod \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") "
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.791525 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-combined-ca-bundle\") pod \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") "
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.791582 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2pt7\" (UniqueName: \"kubernetes.io/projected/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-kube-api-access-q2pt7\") pod \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") "
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.791642 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-httpd-run\") pod \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") "
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.791837 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\" (UID: \"724fcbf8-c3b0-4d69-bf86-f2691e7a8947\") "
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.797417 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "724fcbf8-c3b0-4d69-bf86-f2691e7a8947" (UID: "724fcbf8-c3b0-4d69-bf86-f2691e7a8947"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.797669 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-logs" (OuterVolumeSpecName: "logs") pod "724fcbf8-c3b0-4d69-bf86-f2691e7a8947" (UID: "724fcbf8-c3b0-4d69-bf86-f2691e7a8947"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.799190 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-scripts" (OuterVolumeSpecName: "scripts") pod "724fcbf8-c3b0-4d69-bf86-f2691e7a8947" (UID: "724fcbf8-c3b0-4d69-bf86-f2691e7a8947"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.814304 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-kube-api-access-q2pt7" (OuterVolumeSpecName: "kube-api-access-q2pt7") pod "724fcbf8-c3b0-4d69-bf86-f2691e7a8947" (UID: "724fcbf8-c3b0-4d69-bf86-f2691e7a8947"). InnerVolumeSpecName "kube-api-access-q2pt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.838814 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2" (OuterVolumeSpecName: "glance") pod "724fcbf8-c3b0-4d69-bf86-f2691e7a8947" (UID: "724fcbf8-c3b0-4d69-bf86-f2691e7a8947"). InnerVolumeSpecName "pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.847629 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "724fcbf8-c3b0-4d69-bf86-f2691e7a8947" (UID: "724fcbf8-c3b0-4d69-bf86-f2691e7a8947"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.869701 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-config-data" (OuterVolumeSpecName: "config-data") pod "724fcbf8-c3b0-4d69-bf86-f2691e7a8947" (UID: "724fcbf8-c3b0-4d69-bf86-f2691e7a8947"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.875892 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "724fcbf8-c3b0-4d69-bf86-f2691e7a8947" (UID: "724fcbf8-c3b0-4d69-bf86-f2691e7a8947"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.893874 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-scripts\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.893942 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-logs\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.893956 4777 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.893980 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.893992 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2pt7\" (UniqueName: \"kubernetes.io/projected/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-kube-api-access-q2pt7\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.894003 4777 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.894047 4777 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") on node \"crc\" "
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.894062 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fcbf8-c3b0-4d69-bf86-f2691e7a8947-config-data\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.954314 4777 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.954478 4777 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2") on node "crc"
Feb 16 21:58:27 crc kubenswrapper[4777]: I0216 21:58:27.996583 4777 reconciler_common.go:293] "Volume detached for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") on node \"crc\" DevicePath \"\""
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.258687 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"724fcbf8-c3b0-4d69-bf86-f2691e7a8947","Type":"ContainerDied","Data":"edb15ff0ccb69174e37b5c4e9ff1e65dc74364c54537f2661588b5f40ebb10f4"}
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.258785 4777 scope.go:117] "RemoveContainer" containerID="3921ea88b637d72a568bd5d11b29c76e3f28ea72048d4109f8025384b96b53b6"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.259097 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.290948 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.297753 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.319906 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 16 21:58:28 crc kubenswrapper[4777]: E0216 21:58:28.320353 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e10fb401-c264-4717-b647-c63155d05bf2" containerName="init"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.320372 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e10fb401-c264-4717-b647-c63155d05bf2" containerName="init"
Feb 16 21:58:28 crc kubenswrapper[4777]: E0216 21:58:28.320403 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724fcbf8-c3b0-4d69-bf86-f2691e7a8947" containerName="glance-log"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.320412 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="724fcbf8-c3b0-4d69-bf86-f2691e7a8947" containerName="glance-log"
Feb 16 21:58:28 crc kubenswrapper[4777]: E0216 21:58:28.320428 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc22189e-82c6-4078-ac5b-ce27b2e891fd" containerName="dnsmasq-dns"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.320437 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc22189e-82c6-4078-ac5b-ce27b2e891fd" containerName="dnsmasq-dns"
Feb 16 21:58:28 crc kubenswrapper[4777]: E0216 21:58:28.320453 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724fcbf8-c3b0-4d69-bf86-f2691e7a8947" containerName="glance-httpd"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.320463 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="724fcbf8-c3b0-4d69-bf86-f2691e7a8947" containerName="glance-httpd"
Feb 16 21:58:28 crc kubenswrapper[4777]: E0216 21:58:28.320473 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc22189e-82c6-4078-ac5b-ce27b2e891fd" containerName="init"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.320481 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc22189e-82c6-4078-ac5b-ce27b2e891fd" containerName="init"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.320683 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="724fcbf8-c3b0-4d69-bf86-f2691e7a8947" containerName="glance-log"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.320703 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e10fb401-c264-4717-b647-c63155d05bf2" containerName="init"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.320730 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="724fcbf8-c3b0-4d69-bf86-f2691e7a8947" containerName="glance-httpd"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.320749 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc22189e-82c6-4078-ac5b-ce27b2e891fd" containerName="dnsmasq-dns"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.321941 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.324162 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.324210 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.509220 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8vpv\" (UniqueName: \"kubernetes.io/projected/71a9dba4-89bb-4b87-a1cf-3f031befd745-kube-api-access-g8vpv\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.509268 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.509300 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.509488 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-config-data\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.509581 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.509646 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-scripts\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.509679 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.509846 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-logs\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.611955 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8vpv\" (UniqueName: \"kubernetes.io/projected/71a9dba4-89bb-4b87-a1cf-3f031befd745-kube-api-access-g8vpv\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.612082 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.612146 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.612174 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-config-data\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.612202 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.612227 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-scripts\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.612252 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.612348 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-logs\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.612862 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-logs\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.613095 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.617604 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-scripts\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.617955 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.619614 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-config-data\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0"
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.632367 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.632419 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb51e62c662de87643413e62600db5d03a822462f1c1c4cd14c3bd046c1343d0/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.634829 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.637221 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8vpv\" (UniqueName: \"kubernetes.io/projected/71a9dba4-89bb-4b87-a1cf-3f031befd745-kube-api-access-g8vpv\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.671936 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.950136 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 16 21:58:28 crc kubenswrapper[4777]: I0216 21:58:28.975243 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.196660 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="724fcbf8-c3b0-4d69-bf86-f2691e7a8947" path="/var/lib/kubelet/pods/724fcbf8-c3b0-4d69-bf86-f2691e7a8947/volumes" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.447389 4777 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podbbae3fdc-fa7e-41bc-8c73-8d126b476ba5"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podbbae3fdc-fa7e-41bc-8c73-8d126b476ba5] : Timed out while waiting for systemd to remove kubepods-besteffort-podbbae3fdc_fa7e_41bc_8c73_8d126b476ba5.slice" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.631997 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.754459 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-nb\") pod \"9043c023-eae7-4cba-bcfe-3e44fa168c38\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.754636 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-sb\") pod \"9043c023-eae7-4cba-bcfe-3e44fa168c38\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.754829 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdg2p\" (UniqueName: \"kubernetes.io/projected/9043c023-eae7-4cba-bcfe-3e44fa168c38-kube-api-access-pdg2p\") pod \"9043c023-eae7-4cba-bcfe-3e44fa168c38\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.755608 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-config\") pod \"9043c023-eae7-4cba-bcfe-3e44fa168c38\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.755657 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-dns-svc\") pod \"9043c023-eae7-4cba-bcfe-3e44fa168c38\" (UID: \"9043c023-eae7-4cba-bcfe-3e44fa168c38\") " Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.760868 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9043c023-eae7-4cba-bcfe-3e44fa168c38-kube-api-access-pdg2p" (OuterVolumeSpecName: "kube-api-access-pdg2p") pod "9043c023-eae7-4cba-bcfe-3e44fa168c38" (UID: "9043c023-eae7-4cba-bcfe-3e44fa168c38"). InnerVolumeSpecName "kube-api-access-pdg2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.809143 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9043c023-eae7-4cba-bcfe-3e44fa168c38" (UID: "9043c023-eae7-4cba-bcfe-3e44fa168c38"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.813699 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-config" (OuterVolumeSpecName: "config") pod "9043c023-eae7-4cba-bcfe-3e44fa168c38" (UID: "9043c023-eae7-4cba-bcfe-3e44fa168c38"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.816843 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9043c023-eae7-4cba-bcfe-3e44fa168c38" (UID: "9043c023-eae7-4cba-bcfe-3e44fa168c38"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.817740 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9043c023-eae7-4cba-bcfe-3e44fa168c38" (UID: "9043c023-eae7-4cba-bcfe-3e44fa168c38"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.859559 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.859589 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.859604 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdg2p\" (UniqueName: \"kubernetes.io/projected/9043c023-eae7-4cba-bcfe-3e44fa168c38-kube-api-access-pdg2p\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.859618 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.859630 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9043c023-eae7-4cba-bcfe-3e44fa168c38-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:30 crc kubenswrapper[4777]: I0216 21:58:30.996443 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.006593 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.164890 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-config-data\") pod \"802c1f49-fcd4-4208-94e7-b065ccaaba10\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.164972 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct7w2\" (UniqueName: \"kubernetes.io/projected/164a2b79-be6e-4867-a67a-e5611b831d38-kube-api-access-ct7w2\") pod \"164a2b79-be6e-4867-a67a-e5611b831d38\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.165039 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-credential-keys\") pod \"802c1f49-fcd4-4208-94e7-b065ccaaba10\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.165081 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-scripts\") pod \"164a2b79-be6e-4867-a67a-e5611b831d38\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.165119 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-fernet-keys\") pod \"802c1f49-fcd4-4208-94e7-b065ccaaba10\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.165240 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"164a2b79-be6e-4867-a67a-e5611b831d38\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.165284 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-scripts\") pod \"802c1f49-fcd4-4208-94e7-b065ccaaba10\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.165435 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-httpd-run\") pod \"164a2b79-be6e-4867-a67a-e5611b831d38\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.165456 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-public-tls-certs\") pod \"164a2b79-be6e-4867-a67a-e5611b831d38\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.165496 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-config-data\") pod \"164a2b79-be6e-4867-a67a-e5611b831d38\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.165520 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xmlm\" (UniqueName: \"kubernetes.io/projected/802c1f49-fcd4-4208-94e7-b065ccaaba10-kube-api-access-6xmlm\") pod \"802c1f49-fcd4-4208-94e7-b065ccaaba10\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 
21:58:31.165576 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-logs\") pod \"164a2b79-be6e-4867-a67a-e5611b831d38\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.165624 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-combined-ca-bundle\") pod \"802c1f49-fcd4-4208-94e7-b065ccaaba10\" (UID: \"802c1f49-fcd4-4208-94e7-b065ccaaba10\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.165650 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-combined-ca-bundle\") pod \"164a2b79-be6e-4867-a67a-e5611b831d38\" (UID: \"164a2b79-be6e-4867-a67a-e5611b831d38\") " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.166285 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "164a2b79-be6e-4867-a67a-e5611b831d38" (UID: "164a2b79-be6e-4867-a67a-e5611b831d38"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.166928 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-logs" (OuterVolumeSpecName: "logs") pod "164a2b79-be6e-4867-a67a-e5611b831d38" (UID: "164a2b79-be6e-4867-a67a-e5611b831d38"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.168756 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/802c1f49-fcd4-4208-94e7-b065ccaaba10-kube-api-access-6xmlm" (OuterVolumeSpecName: "kube-api-access-6xmlm") pod "802c1f49-fcd4-4208-94e7-b065ccaaba10" (UID: "802c1f49-fcd4-4208-94e7-b065ccaaba10"). InnerVolumeSpecName "kube-api-access-6xmlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.170432 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "802c1f49-fcd4-4208-94e7-b065ccaaba10" (UID: "802c1f49-fcd4-4208-94e7-b065ccaaba10"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.171368 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-scripts" (OuterVolumeSpecName: "scripts") pod "802c1f49-fcd4-4208-94e7-b065ccaaba10" (UID: "802c1f49-fcd4-4208-94e7-b065ccaaba10"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.172625 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "802c1f49-fcd4-4208-94e7-b065ccaaba10" (UID: "802c1f49-fcd4-4208-94e7-b065ccaaba10"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.173404 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-scripts" (OuterVolumeSpecName: "scripts") pod "164a2b79-be6e-4867-a67a-e5611b831d38" (UID: "164a2b79-be6e-4867-a67a-e5611b831d38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.175861 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/164a2b79-be6e-4867-a67a-e5611b831d38-kube-api-access-ct7w2" (OuterVolumeSpecName: "kube-api-access-ct7w2") pod "164a2b79-be6e-4867-a67a-e5611b831d38" (UID: "164a2b79-be6e-4867-a67a-e5611b831d38"). InnerVolumeSpecName "kube-api-access-ct7w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.182279 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4" (OuterVolumeSpecName: "glance") pod "164a2b79-be6e-4867-a67a-e5611b831d38" (UID: "164a2b79-be6e-4867-a67a-e5611b831d38"). InnerVolumeSpecName "pvc-94c2b3e1-91fc-479b-9144-fb21290011a4". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.194321 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "164a2b79-be6e-4867-a67a-e5611b831d38" (UID: "164a2b79-be6e-4867-a67a-e5611b831d38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.246936 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-config-data" (OuterVolumeSpecName: "config-data") pod "802c1f49-fcd4-4208-94e7-b065ccaaba10" (UID: "802c1f49-fcd4-4208-94e7-b065ccaaba10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.257312 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "802c1f49-fcd4-4208-94e7-b065ccaaba10" (UID: "802c1f49-fcd4-4208-94e7-b065ccaaba10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.266363 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "164a2b79-be6e-4867-a67a-e5611b831d38" (UID: "164a2b79-be6e-4867-a67a-e5611b831d38"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267426 4777 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267448 4777 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267460 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xmlm\" (UniqueName: \"kubernetes.io/projected/802c1f49-fcd4-4208-94e7-b065ccaaba10-kube-api-access-6xmlm\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267470 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/164a2b79-be6e-4867-a67a-e5611b831d38-logs\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267478 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267486 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267494 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267502 4777 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct7w2\" (UniqueName: \"kubernetes.io/projected/164a2b79-be6e-4867-a67a-e5611b831d38-kube-api-access-ct7w2\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267510 4777 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267519 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267527 4777 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267548 4777 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") on node \"crc\" " Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.267558 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/802c1f49-fcd4-4208-94e7-b065ccaaba10-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.270885 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-config-data" (OuterVolumeSpecName: "config-data") pod "164a2b79-be6e-4867-a67a-e5611b831d38" (UID: "164a2b79-be6e-4867-a67a-e5611b831d38"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.287393 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-csnks" event={"ID":"802c1f49-fcd4-4208-94e7-b065ccaaba10","Type":"ContainerDied","Data":"9cb1c546ea7f270b7342b020a4f0a99e918a672d1843f59c57b60c8c3d05c761"} Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.287450 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb1c546ea7f270b7342b020a4f0a99e918a672d1843f59c57b60c8c3d05c761" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.287417 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-csnks" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.289366 4777 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.289454 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"164a2b79-be6e-4867-a67a-e5611b831d38","Type":"ContainerDied","Data":"ae0e13926a06ceb2a57c6c974466ffac3088ae4a1dd1019610e671c739c92737"} Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.289531 4777 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-94c2b3e1-91fc-479b-9144-fb21290011a4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4") on node "crc" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.289536 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.296948 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rx9j2" event={"ID":"9043c023-eae7-4cba-bcfe-3e44fa168c38","Type":"ContainerDied","Data":"69bc2cd7615cd7d33ad59e46294e3cb0e47554eabfdf20c4f1103ec26f163321"} Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.297059 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rx9j2" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.339382 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.365017 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.369354 4777 reconciler_common.go:293] "Volume detached for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.369392 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/164a2b79-be6e-4867-a67a-e5611b831d38-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.394929 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 16 21:58:31 crc kubenswrapper[4777]: E0216 21:58:31.395584 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9043c023-eae7-4cba-bcfe-3e44fa168c38" containerName="dnsmasq-dns" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.395601 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="9043c023-eae7-4cba-bcfe-3e44fa168c38" 
containerName="dnsmasq-dns" Feb 16 21:58:31 crc kubenswrapper[4777]: E0216 21:58:31.395632 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164a2b79-be6e-4867-a67a-e5611b831d38" containerName="glance-log" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.395640 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="164a2b79-be6e-4867-a67a-e5611b831d38" containerName="glance-log" Feb 16 21:58:31 crc kubenswrapper[4777]: E0216 21:58:31.395657 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="802c1f49-fcd4-4208-94e7-b065ccaaba10" containerName="keystone-bootstrap" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.395666 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="802c1f49-fcd4-4208-94e7-b065ccaaba10" containerName="keystone-bootstrap" Feb 16 21:58:31 crc kubenswrapper[4777]: E0216 21:58:31.395682 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9043c023-eae7-4cba-bcfe-3e44fa168c38" containerName="init" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.395690 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="9043c023-eae7-4cba-bcfe-3e44fa168c38" containerName="init" Feb 16 21:58:31 crc kubenswrapper[4777]: E0216 21:58:31.395739 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164a2b79-be6e-4867-a67a-e5611b831d38" containerName="glance-httpd" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.395748 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="164a2b79-be6e-4867-a67a-e5611b831d38" containerName="glance-httpd" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.395943 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="9043c023-eae7-4cba-bcfe-3e44fa168c38" containerName="dnsmasq-dns" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.395966 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="164a2b79-be6e-4867-a67a-e5611b831d38" containerName="glance-log" Feb 16 21:58:31 
crc kubenswrapper[4777]: I0216 21:58:31.395975 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="164a2b79-be6e-4867-a67a-e5611b831d38" containerName="glance-httpd" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.395983 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="802c1f49-fcd4-4208-94e7-b065ccaaba10" containerName="keystone-bootstrap" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.397269 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.400090 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.400594 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.406058 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx9j2"] Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.417222 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rx9j2"] Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.424958 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.572659 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-config-data\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.572705 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-scripts\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.572742 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.572784 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-logs\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.572831 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x99zs\" (UniqueName: \"kubernetes.io/projected/728c28d7-febe-4196-85fe-55e8278e8899-kube-api-access-x99zs\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.572853 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.572928 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.572970 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.674975 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.675463 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.675514 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-config-data\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.675541 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-scripts\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.675559 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.675596 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-logs\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.675640 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x99zs\" (UniqueName: \"kubernetes.io/projected/728c28d7-febe-4196-85fe-55e8278e8899-kube-api-access-x99zs\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.675664 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.678426 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-logs\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.679324 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.679937 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.680092 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.680127 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4a776c3602f9f8f9eee625c68304a3bb11e095b4c9719ede519a11464d274ec4/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.680282 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-config-data\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.681418 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.682393 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-scripts\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.692774 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x99zs\" (UniqueName: \"kubernetes.io/projected/728c28d7-febe-4196-85fe-55e8278e8899-kube-api-access-x99zs\") pod 
\"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:31 crc kubenswrapper[4777]: I0216 21:58:31.724970 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " pod="openstack/glance-default-external-api-0" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.022093 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.094804 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-csnks"] Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.104313 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-csnks"] Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.210058 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="164a2b79-be6e-4867-a67a-e5611b831d38" path="/var/lib/kubelet/pods/164a2b79-be6e-4867-a67a-e5611b831d38/volumes" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.211613 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="802c1f49-fcd4-4208-94e7-b065ccaaba10" path="/var/lib/kubelet/pods/802c1f49-fcd4-4208-94e7-b065ccaaba10/volumes" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.212466 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9043c023-eae7-4cba-bcfe-3e44fa168c38" path="/var/lib/kubelet/pods/9043c023-eae7-4cba-bcfe-3e44fa168c38/volumes" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.214061 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-njzfj"] Feb 16 21:58:32 crc 
kubenswrapper[4777]: I0216 21:58:32.215428 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.217582 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.217870 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.218411 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cgg5d" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.218491 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.236771 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-njzfj"] Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.387049 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-fernet-keys\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.387163 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-scripts\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.387260 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-combined-ca-bundle\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.387328 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7drfs\" (UniqueName: \"kubernetes.io/projected/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-kube-api-access-7drfs\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.387377 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-config-data\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.387469 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-credential-keys\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.488706 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-combined-ca-bundle\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.488773 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7drfs\" (UniqueName: 
\"kubernetes.io/projected/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-kube-api-access-7drfs\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.488797 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-config-data\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.488840 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-credential-keys\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.488883 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-fernet-keys\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.488923 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-scripts\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.494036 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-credential-keys\") pod \"keystone-bootstrap-njzfj\" (UID: 
\"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.494596 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-config-data\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.495288 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-scripts\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.498253 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-combined-ca-bundle\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.498444 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-fernet-keys\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.505124 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7drfs\" (UniqueName: \"kubernetes.io/projected/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-kube-api-access-7drfs\") pod \"keystone-bootstrap-njzfj\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:32 crc kubenswrapper[4777]: I0216 21:58:32.545400 
4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:34 crc kubenswrapper[4777]: I0216 21:58:34.170633 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-rx9j2" podUID="9043c023-eae7-4cba-bcfe-3e44fa168c38" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Feb 16 21:58:34 crc kubenswrapper[4777]: E0216 21:58:34.340247 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 21:58:34 crc kubenswrapper[4777]: E0216 21:58:34.340315 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 21:58:34 crc kubenswrapper[4777]: E0216 21:58:34.340475 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessPr
obe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError" Feb 16 21:58:34 crc kubenswrapper[4777]: E0216 21:58:34.341812 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 21:58:40 crc kubenswrapper[4777]: E0216 21:58:40.589992 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 16 21:58:40 crc kubenswrapper[4777]: E0216 21:58:40.591076 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fcdh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:
[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-pl2w2_openstack(4a4d7671-062a-4647-9a54-6933f7cc3a4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 16 21:58:40 crc kubenswrapper[4777]: E0216 21:58:40.592343 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-pl2w2" podUID="4a4d7671-062a-4647-9a54-6933f7cc3a4d" Feb 16 21:58:41 crc kubenswrapper[4777]: I0216 21:58:41.393024 4777 generic.go:334] "Generic (PLEG): container finished" podID="6681c372-76f5-4242-a533-0db4f1e711c8" containerID="d0686f70a8a7c62232275422cf5ebd883e6d555c19b952c8f5f33f9dfab3e888" exitCode=0 Feb 16 21:58:41 crc kubenswrapper[4777]: I0216 21:58:41.393855 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p4v8h" event={"ID":"6681c372-76f5-4242-a533-0db4f1e711c8","Type":"ContainerDied","Data":"d0686f70a8a7c62232275422cf5ebd883e6d555c19b952c8f5f33f9dfab3e888"} Feb 16 21:58:41 crc kubenswrapper[4777]: E0216 21:58:41.395300 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-pl2w2" podUID="4a4d7671-062a-4647-9a54-6933f7cc3a4d" Feb 16 21:58:41 crc kubenswrapper[4777]: I0216 21:58:41.670797 4777 scope.go:117] "RemoveContainer" containerID="a2724b813226c186fc4b5a5e1ef36486117c3469645023e2ad8a4eb2cdf173af" Feb 16 21:58:41 crc kubenswrapper[4777]: E0216 21:58:41.707172 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 16 21:58:41 crc kubenswrapper[4777]: E0216 21:58:41.707725 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zg4k2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathEx
pr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-z9tj6_openstack(626f6429-977f-4c1f-b055-3502cb530645): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 16 21:58:41 crc kubenswrapper[4777]: E0216 21:58:41.709065 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-z9tj6" podUID="626f6429-977f-4c1f-b055-3502cb530645" Feb 16 21:58:41 crc kubenswrapper[4777]: I0216 21:58:41.733851 4777 scope.go:117] "RemoveContainer" containerID="0d730477226bc83c6f51b8ead5d7541621d20fe7d2f91664f9848fcd718c3f15" Feb 16 21:58:41 crc kubenswrapper[4777]: I0216 21:58:41.925335 4777 scope.go:117] "RemoveContainer" containerID="b6fe8cf11222ba136db083b8853acdf1dab43bbb9d82861f49c1f92792d96fc4" Feb 16 21:58:41 crc kubenswrapper[4777]: I0216 21:58:41.962081 4777 scope.go:117] "RemoveContainer" containerID="e0381be60a85348ea300843122315b258d68348370597d9404770c1c589231b2" Feb 16 21:58:41 crc kubenswrapper[4777]: I0216 21:58:41.988213 4777 scope.go:117] "RemoveContainer" containerID="59c4a9175f1127800a7e66bfb7efbac3333819929c7c5f613beb21b8b3d5c524" Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.300195 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.397448 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-njzfj"] Feb 16 21:58:42 crc kubenswrapper[4777]: W0216 21:58:42.402034 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7010b1ce_fc40_4fc8_ae63_53368dfd55f9.slice/crio-f8da1433899be7df8334348d40dbc375dc40d3b3a88e563d46ed1dcf09708ddf WatchSource:0}: Error finding container f8da1433899be7df8334348d40dbc375dc40d3b3a88e563d46ed1dcf09708ddf: Status 404 returned error can't find the container with id f8da1433899be7df8334348d40dbc375dc40d3b3a88e563d46ed1dcf09708ddf Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.406627 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pgthg" event={"ID":"332f9253-5c7a-4cc9-aead-9adc1fe86b2e","Type":"ContainerStarted","Data":"eca6c7b5b9ce4f355f27e7fa3c6f275807e1bbaf26835460fab1390227518218"} Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.415516 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"728c28d7-febe-4196-85fe-55e8278e8899","Type":"ContainerStarted","Data":"bc1b5d56e9c3889157af4be45f9afef1946656464f94a369f0548730a23191c5"} Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.426829 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56","Type":"ContainerStarted","Data":"601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f"} Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.430090 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-pgthg" podStartSLOduration=4.23772605 podStartE2EDuration="27.43006675s" podCreationTimestamp="2026-02-16 21:58:15 +0000 UTC" 
firstStartedPulling="2026-02-16 21:58:17.346280236 +0000 UTC m=+1217.928781338" lastFinishedPulling="2026-02-16 21:58:40.538620926 +0000 UTC m=+1241.121122038" observedRunningTime="2026-02-16 21:58:42.422856298 +0000 UTC m=+1243.005357410" watchObservedRunningTime="2026-02-16 21:58:42.43006675 +0000 UTC m=+1243.012567852" Feb 16 21:58:42 crc kubenswrapper[4777]: E0216 21:58:42.430390 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-z9tj6" podUID="626f6429-977f-4c1f-b055-3502cb530645" Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.797734 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.923586 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjzn8\" (UniqueName: \"kubernetes.io/projected/6681c372-76f5-4242-a533-0db4f1e711c8-kube-api-access-wjzn8\") pod \"6681c372-76f5-4242-a533-0db4f1e711c8\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.923737 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-combined-ca-bundle\") pod \"6681c372-76f5-4242-a533-0db4f1e711c8\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.923825 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-config\") pod \"6681c372-76f5-4242-a533-0db4f1e711c8\" (UID: \"6681c372-76f5-4242-a533-0db4f1e711c8\") " Feb 16 21:58:42 crc kubenswrapper[4777]: 
I0216 21:58:42.929199 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6681c372-76f5-4242-a533-0db4f1e711c8-kube-api-access-wjzn8" (OuterVolumeSpecName: "kube-api-access-wjzn8") pod "6681c372-76f5-4242-a533-0db4f1e711c8" (UID: "6681c372-76f5-4242-a533-0db4f1e711c8"). InnerVolumeSpecName "kube-api-access-wjzn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.949309 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-config" (OuterVolumeSpecName: "config") pod "6681c372-76f5-4242-a533-0db4f1e711c8" (UID: "6681c372-76f5-4242-a533-0db4f1e711c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:42 crc kubenswrapper[4777]: I0216 21:58:42.954970 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6681c372-76f5-4242-a533-0db4f1e711c8" (UID: "6681c372-76f5-4242-a533-0db4f1e711c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.025897 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjzn8\" (UniqueName: \"kubernetes.io/projected/6681c372-76f5-4242-a533-0db4f1e711c8-kube-api-access-wjzn8\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.025931 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.025942 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6681c372-76f5-4242-a533-0db4f1e711c8-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.418831 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 16 21:58:43 crc kubenswrapper[4777]: W0216 21:58:43.419760 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71a9dba4_89bb_4b87_a1cf_3f031befd745.slice/crio-de505b4cdf02bb85410143424239e3b0d421ed90d5bb5f7d240993a4586e9214 WatchSource:0}: Error finding container de505b4cdf02bb85410143424239e3b0d421ed90d5bb5f7d240993a4586e9214: Status 404 returned error can't find the container with id de505b4cdf02bb85410143424239e3b0d421ed90d5bb5f7d240993a4586e9214 Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.456385 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71a9dba4-89bb-4b87-a1cf-3f031befd745","Type":"ContainerStarted","Data":"de505b4cdf02bb85410143424239e3b0d421ed90d5bb5f7d240993a4586e9214"} Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.458146 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"728c28d7-febe-4196-85fe-55e8278e8899","Type":"ContainerStarted","Data":"3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef"} Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.461023 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p4v8h" event={"ID":"6681c372-76f5-4242-a533-0db4f1e711c8","Type":"ContainerDied","Data":"68a79603bc9b9ea4ab1656e7102892acb25df42c7dd73acbbe25ac7b577b9275"} Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.461067 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68a79603bc9b9ea4ab1656e7102892acb25df42c7dd73acbbe25ac7b577b9275" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.461122 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p4v8h" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.467075 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-njzfj" event={"ID":"7010b1ce-fc40-4fc8-ae63-53368dfd55f9","Type":"ContainerStarted","Data":"6ced8a5ba17e392ecffdea1ece914209ced3a576c5624f5039c6f2cb8bb093bf"} Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.467126 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-njzfj" event={"ID":"7010b1ce-fc40-4fc8-ae63-53368dfd55f9","Type":"ContainerStarted","Data":"f8da1433899be7df8334348d40dbc375dc40d3b3a88e563d46ed1dcf09708ddf"} Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.472328 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56","Type":"ContainerStarted","Data":"6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f"} Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.485296 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-bootstrap-njzfj" podStartSLOduration=11.485275783 podStartE2EDuration="11.485275783s" podCreationTimestamp="2026-02-16 21:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:43.484378408 +0000 UTC m=+1244.066879510" watchObservedRunningTime="2026-02-16 21:58:43.485275783 +0000 UTC m=+1244.067776885" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.638575 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-z7mrk"] Feb 16 21:58:43 crc kubenswrapper[4777]: E0216 21:58:43.653953 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6681c372-76f5-4242-a533-0db4f1e711c8" containerName="neutron-db-sync" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.653989 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="6681c372-76f5-4242-a533-0db4f1e711c8" containerName="neutron-db-sync" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.654248 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="6681c372-76f5-4242-a533-0db4f1e711c8" containerName="neutron-db-sync" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.655202 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.664897 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-z7mrk"] Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.733190 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f8f666f74-96kzt"] Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.739511 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.754904 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7v7qn" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.755057 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.755155 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.755246 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.755955 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-httpd-config\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.756003 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.756050 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 
21:58:43.756081 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7rdr\" (UniqueName: \"kubernetes.io/projected/3e40580f-139c-49bd-9f65-72896786b32d-kube-api-access-d7rdr\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.756115 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-svc\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.756145 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-combined-ca-bundle\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.756165 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-ovndb-tls-certs\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.756186 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc 
kubenswrapper[4777]: I0216 21:58:43.756214 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgcn\" (UniqueName: \"kubernetes.io/projected/e59a46ad-39b7-4fdd-b942-aa753716c6a8-kube-api-access-xlgcn\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.756233 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-config\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.756251 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-config\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.764545 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f8f666f74-96kzt"] Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.857953 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.858039 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-swift-storage-0\") pod 
\"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.858088 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7rdr\" (UniqueName: \"kubernetes.io/projected/3e40580f-139c-49bd-9f65-72896786b32d-kube-api-access-d7rdr\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.858123 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-svc\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.858159 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-combined-ca-bundle\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.858183 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-ovndb-tls-certs\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.858208 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: 
\"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.858247 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgcn\" (UniqueName: \"kubernetes.io/projected/e59a46ad-39b7-4fdd-b942-aa753716c6a8-kube-api-access-xlgcn\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.858267 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-config\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.858291 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-config\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.858322 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-httpd-config\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.858915 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 
21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.860237 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.860518 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.860995 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-config\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.861147 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-svc\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.863426 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-combined-ca-bundle\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.898539 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-httpd-config\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.899070 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-ovndb-tls-certs\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.899500 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-config\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.902968 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7rdr\" (UniqueName: \"kubernetes.io/projected/3e40580f-139c-49bd-9f65-72896786b32d-kube-api-access-d7rdr\") pod \"neutron-6f8f666f74-96kzt\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.906258 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgcn\" (UniqueName: \"kubernetes.io/projected/e59a46ad-39b7-4fdd-b942-aa753716c6a8-kube-api-access-xlgcn\") pod \"dnsmasq-dns-55f844cf75-z7mrk\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:43 crc kubenswrapper[4777]: I0216 21:58:43.982914 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:44 crc kubenswrapper[4777]: I0216 21:58:44.078730 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:44 crc kubenswrapper[4777]: I0216 21:58:44.492397 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71a9dba4-89bb-4b87-a1cf-3f031befd745","Type":"ContainerStarted","Data":"333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858"} Feb 16 21:58:44 crc kubenswrapper[4777]: I0216 21:58:44.494389 4777 generic.go:334] "Generic (PLEG): container finished" podID="332f9253-5c7a-4cc9-aead-9adc1fe86b2e" containerID="eca6c7b5b9ce4f355f27e7fa3c6f275807e1bbaf26835460fab1390227518218" exitCode=0 Feb 16 21:58:44 crc kubenswrapper[4777]: I0216 21:58:44.494458 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pgthg" event={"ID":"332f9253-5c7a-4cc9-aead-9adc1fe86b2e","Type":"ContainerDied","Data":"eca6c7b5b9ce4f355f27e7fa3c6f275807e1bbaf26835460fab1390227518218"} Feb 16 21:58:44 crc kubenswrapper[4777]: I0216 21:58:44.497443 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"728c28d7-febe-4196-85fe-55e8278e8899","Type":"ContainerStarted","Data":"3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1"} Feb 16 21:58:44 crc kubenswrapper[4777]: I0216 21:58:44.515765 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-z7mrk"] Feb 16 21:58:44 crc kubenswrapper[4777]: I0216 21:58:44.555522 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.555506177 podStartE2EDuration="13.555506177s" podCreationTimestamp="2026-02-16 21:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:44.541207717 +0000 UTC m=+1245.123708819" watchObservedRunningTime="2026-02-16 21:58:44.555506177 +0000 UTC m=+1245.138007279" Feb 16 21:58:44 crc kubenswrapper[4777]: I0216 21:58:44.741225 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f8f666f74-96kzt"] Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.513152 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f8f666f74-96kzt" event={"ID":"3e40580f-139c-49bd-9f65-72896786b32d","Type":"ContainerStarted","Data":"58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822"} Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.513727 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f8f666f74-96kzt" event={"ID":"3e40580f-139c-49bd-9f65-72896786b32d","Type":"ContainerStarted","Data":"1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59"} Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.513745 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f8f666f74-96kzt" event={"ID":"3e40580f-139c-49bd-9f65-72896786b32d","Type":"ContainerStarted","Data":"5bb503efcff6dd33d7b2d6780a3b8374341813c5efc91b9cf66e151ba5805df9"} Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.513799 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.521666 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71a9dba4-89bb-4b87-a1cf-3f031befd745","Type":"ContainerStarted","Data":"daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296"} Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.525472 4777 generic.go:334] "Generic (PLEG): container finished" podID="e59a46ad-39b7-4fdd-b942-aa753716c6a8" 
containerID="050e8a4251be5be6fb141884a7f5c742176942bffeff1b64fa84b6483b125412" exitCode=0 Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.525934 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" event={"ID":"e59a46ad-39b7-4fdd-b942-aa753716c6a8","Type":"ContainerDied","Data":"050e8a4251be5be6fb141884a7f5c742176942bffeff1b64fa84b6483b125412"} Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.525998 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" event={"ID":"e59a46ad-39b7-4fdd-b942-aa753716c6a8","Type":"ContainerStarted","Data":"454b505091e6a44f60e1bce88a4764d3d9dab27440a6a9809a53816342ef4425"} Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.538332 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f8f666f74-96kzt" podStartSLOduration=2.538311143 podStartE2EDuration="2.538311143s" podCreationTimestamp="2026-02-16 21:58:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:45.534557858 +0000 UTC m=+1246.117058970" watchObservedRunningTime="2026-02-16 21:58:45.538311143 +0000 UTC m=+1246.120812255" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.562673 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=17.562652505 podStartE2EDuration="17.562652505s" podCreationTimestamp="2026-02-16 21:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:45.557219743 +0000 UTC m=+1246.139720845" watchObservedRunningTime="2026-02-16 21:58:45.562652505 +0000 UTC m=+1246.145153607" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.856708 4777 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-6f86f96cb9-gdgkm"] Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.859071 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.863472 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.863650 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.885100 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f86f96cb9-gdgkm"] Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.996182 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-internal-tls-certs\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.996243 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-ovndb-tls-certs\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.996367 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb9vk\" (UniqueName: \"kubernetes.io/projected/1bb43a06-461f-46ca-b3ed-419ee64ea40f-kube-api-access-hb9vk\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 
21:58:45.996581 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-public-tls-certs\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.996644 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-config\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.996669 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-combined-ca-bundle\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:45 crc kubenswrapper[4777]: I0216 21:58:45.996702 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-httpd-config\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.098297 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-config\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.098342 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-combined-ca-bundle\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.098369 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-httpd-config\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.098456 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-internal-tls-certs\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.098496 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-ovndb-tls-certs\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.098515 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb9vk\" (UniqueName: \"kubernetes.io/projected/1bb43a06-461f-46ca-b3ed-419ee64ea40f-kube-api-access-hb9vk\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.098562 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-public-tls-certs\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.106157 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-internal-tls-certs\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.107565 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-public-tls-certs\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.107600 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-config\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.108267 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-combined-ca-bundle\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.111762 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-httpd-config\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: 
\"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.114232 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-ovndb-tls-certs\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.117463 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb9vk\" (UniqueName: \"kubernetes.io/projected/1bb43a06-461f-46ca-b3ed-419ee64ea40f-kube-api-access-hb9vk\") pod \"neutron-6f86f96cb9-gdgkm\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.199388 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pgthg" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.274525 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.303908 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6c9h\" (UniqueName: \"kubernetes.io/projected/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-kube-api-access-b6c9h\") pod \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.303990 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-scripts\") pod \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.304025 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-config-data\") pod \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.304087 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-combined-ca-bundle\") pod \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.304119 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-logs\") pod \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\" (UID: \"332f9253-5c7a-4cc9-aead-9adc1fe86b2e\") " Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.304777 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-logs" (OuterVolumeSpecName: "logs") pod "332f9253-5c7a-4cc9-aead-9adc1fe86b2e" (UID: "332f9253-5c7a-4cc9-aead-9adc1fe86b2e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.308309 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-kube-api-access-b6c9h" (OuterVolumeSpecName: "kube-api-access-b6c9h") pod "332f9253-5c7a-4cc9-aead-9adc1fe86b2e" (UID: "332f9253-5c7a-4cc9-aead-9adc1fe86b2e"). InnerVolumeSpecName "kube-api-access-b6c9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.311786 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-scripts" (OuterVolumeSpecName: "scripts") pod "332f9253-5c7a-4cc9-aead-9adc1fe86b2e" (UID: "332f9253-5c7a-4cc9-aead-9adc1fe86b2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.375831 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-config-data" (OuterVolumeSpecName: "config-data") pod "332f9253-5c7a-4cc9-aead-9adc1fe86b2e" (UID: "332f9253-5c7a-4cc9-aead-9adc1fe86b2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.390875 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "332f9253-5c7a-4cc9-aead-9adc1fe86b2e" (UID: "332f9253-5c7a-4cc9-aead-9adc1fe86b2e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.406989 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6c9h\" (UniqueName: \"kubernetes.io/projected/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-kube-api-access-b6c9h\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.407018 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.407027 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.407037 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.407046 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/332f9253-5c7a-4cc9-aead-9adc1fe86b2e-logs\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.538440 4777 generic.go:334] "Generic (PLEG): container finished" podID="7010b1ce-fc40-4fc8-ae63-53368dfd55f9" containerID="6ced8a5ba17e392ecffdea1ece914209ced3a576c5624f5039c6f2cb8bb093bf" exitCode=0 Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.538497 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-njzfj" event={"ID":"7010b1ce-fc40-4fc8-ae63-53368dfd55f9","Type":"ContainerDied","Data":"6ced8a5ba17e392ecffdea1ece914209ced3a576c5624f5039c6f2cb8bb093bf"} Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.549117 4777 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" event={"ID":"e59a46ad-39b7-4fdd-b942-aa753716c6a8","Type":"ContainerStarted","Data":"3a2a013017b489637009a466c9634556c144ec7dc1e1cfd8df3a0a6e8d63c457"} Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.549255 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.552926 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pgthg" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.553080 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pgthg" event={"ID":"332f9253-5c7a-4cc9-aead-9adc1fe86b2e","Type":"ContainerDied","Data":"a16edc283f11001d268d46d49dd8056017f0754af4618977e086747b51fd6663"} Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.553127 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a16edc283f11001d268d46d49dd8056017f0754af4618977e086747b51fd6663" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.608841 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" podStartSLOduration=3.608823825 podStartE2EDuration="3.608823825s" podCreationTimestamp="2026-02-16 21:58:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:46.593459285 +0000 UTC m=+1247.175960377" watchObservedRunningTime="2026-02-16 21:58:46.608823825 +0000 UTC m=+1247.191324917" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.696763 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6f8865555d-rstbr"] Feb 16 21:58:46 crc kubenswrapper[4777]: E0216 21:58:46.697238 4777 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="332f9253-5c7a-4cc9-aead-9adc1fe86b2e" containerName="placement-db-sync" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.697251 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="332f9253-5c7a-4cc9-aead-9adc1fe86b2e" containerName="placement-db-sync" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.697456 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="332f9253-5c7a-4cc9-aead-9adc1fe86b2e" containerName="placement-db-sync" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.698635 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.701806 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.702004 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.702120 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qg6qv" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.702221 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.702377 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.707355 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f8865555d-rstbr"] Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.814274 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-internal-tls-certs\") pod \"placement-6f8865555d-rstbr\" (UID: 
\"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.814352 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ntrr\" (UniqueName: \"kubernetes.io/projected/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-kube-api-access-7ntrr\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.814410 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-combined-ca-bundle\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.814448 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-scripts\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.814468 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-logs\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.814491 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-config-data\") pod \"placement-6f8865555d-rstbr\" (UID: 
\"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.814644 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-public-tls-certs\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.916936 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-internal-tls-certs\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.916986 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ntrr\" (UniqueName: \"kubernetes.io/projected/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-kube-api-access-7ntrr\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.917024 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-combined-ca-bundle\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.917059 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-scripts\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " 
pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.917077 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-logs\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.917119 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-config-data\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.917200 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-public-tls-certs\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.917696 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-logs\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.924622 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-internal-tls-certs\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.927702 4777 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-public-tls-certs\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.933074 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-scripts\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.936763 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ntrr\" (UniqueName: \"kubernetes.io/projected/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-kube-api-access-7ntrr\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.936946 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-combined-ca-bundle\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:46 crc kubenswrapper[4777]: I0216 21:58:46.939943 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-config-data\") pod \"placement-6f8865555d-rstbr\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:47 crc kubenswrapper[4777]: I0216 21:58:47.061293 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:48 crc kubenswrapper[4777]: I0216 21:58:48.950601 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 16 21:58:48 crc kubenswrapper[4777]: I0216 21:58:48.951175 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 16 21:58:48 crc kubenswrapper[4777]: I0216 21:58:48.988700 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 16 21:58:49 crc kubenswrapper[4777]: I0216 21:58:49.009575 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 16 21:58:49 crc kubenswrapper[4777]: E0216 21:58:49.183683 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 21:58:49 crc kubenswrapper[4777]: I0216 21:58:49.604731 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 16 21:58:49 crc kubenswrapper[4777]: I0216 21:58:49.604777 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.292952 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.327260 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-config-data\") pod \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.327446 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-combined-ca-bundle\") pod \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.327495 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7drfs\" (UniqueName: \"kubernetes.io/projected/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-kube-api-access-7drfs\") pod \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.327570 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-fernet-keys\") pod \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.327640 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-scripts\") pod \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.327703 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-credential-keys\") pod \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\" (UID: \"7010b1ce-fc40-4fc8-ae63-53368dfd55f9\") " Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.332163 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7010b1ce-fc40-4fc8-ae63-53368dfd55f9" (UID: "7010b1ce-fc40-4fc8-ae63-53368dfd55f9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.332174 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-kube-api-access-7drfs" (OuterVolumeSpecName: "kube-api-access-7drfs") pod "7010b1ce-fc40-4fc8-ae63-53368dfd55f9" (UID: "7010b1ce-fc40-4fc8-ae63-53368dfd55f9"). InnerVolumeSpecName "kube-api-access-7drfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.332579 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-scripts" (OuterVolumeSpecName: "scripts") pod "7010b1ce-fc40-4fc8-ae63-53368dfd55f9" (UID: "7010b1ce-fc40-4fc8-ae63-53368dfd55f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.342792 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7010b1ce-fc40-4fc8-ae63-53368dfd55f9" (UID: "7010b1ce-fc40-4fc8-ae63-53368dfd55f9"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.382761 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-config-data" (OuterVolumeSpecName: "config-data") pod "7010b1ce-fc40-4fc8-ae63-53368dfd55f9" (UID: "7010b1ce-fc40-4fc8-ae63-53368dfd55f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.391854 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7010b1ce-fc40-4fc8-ae63-53368dfd55f9" (UID: "7010b1ce-fc40-4fc8-ae63-53368dfd55f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.430565 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.430599 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.430610 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7drfs\" (UniqueName: \"kubernetes.io/projected/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-kube-api-access-7drfs\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.430619 4777 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-fernet-keys\") on node \"crc\" DevicePath 
\"\"" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.430627 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.430635 4777 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7010b1ce-fc40-4fc8-ae63-53368dfd55f9-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.609896 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f8865555d-rstbr"] Feb 16 21:58:51 crc kubenswrapper[4777]: W0216 21:58:51.613706 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7d06dbb_1aa3_475f_bbc4_352bc009d3b9.slice/crio-2838d1c10fa666b0b9be6802131a1609d86d10c49576ec940a6eeb20bfbb17f7 WatchSource:0}: Error finding container 2838d1c10fa666b0b9be6802131a1609d86d10c49576ec940a6eeb20bfbb17f7: Status 404 returned error can't find the container with id 2838d1c10fa666b0b9be6802131a1609d86d10c49576ec940a6eeb20bfbb17f7 Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.626158 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f8865555d-rstbr" event={"ID":"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9","Type":"ContainerStarted","Data":"2838d1c10fa666b0b9be6802131a1609d86d10c49576ec940a6eeb20bfbb17f7"} Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.629940 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-njzfj" event={"ID":"7010b1ce-fc40-4fc8-ae63-53368dfd55f9","Type":"ContainerDied","Data":"f8da1433899be7df8334348d40dbc375dc40d3b3a88e563d46ed1dcf09708ddf"} Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.629977 4777 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f8da1433899be7df8334348d40dbc375dc40d3b3a88e563d46ed1dcf09708ddf" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.630035 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-njzfj" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.642732 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56","Type":"ContainerStarted","Data":"f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233"} Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.680246 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 16 21:58:51 crc kubenswrapper[4777]: I0216 21:58:51.750418 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f86f96cb9-gdgkm"] Feb 16 21:58:51 crc kubenswrapper[4777]: W0216 21:58:51.756456 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bb43a06_461f_46ca_b3ed_419ee64ea40f.slice/crio-0c26b31254dd8e5a9aae5b289e3dadc106471bb63878ed7e39e61f06a8203bfb WatchSource:0}: Error finding container 0c26b31254dd8e5a9aae5b289e3dadc106471bb63878ed7e39e61f06a8203bfb: Status 404 returned error can't find the container with id 0c26b31254dd8e5a9aae5b289e3dadc106471bb63878ed7e39e61f06a8203bfb Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.022371 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.022412 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.096906 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.097415 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.507757 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-57454d8655-v6bgf"] Feb 16 21:58:52 crc kubenswrapper[4777]: E0216 21:58:52.508212 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010b1ce-fc40-4fc8-ae63-53368dfd55f9" containerName="keystone-bootstrap" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.508228 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010b1ce-fc40-4fc8-ae63-53368dfd55f9" containerName="keystone-bootstrap" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.508468 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010b1ce-fc40-4fc8-ae63-53368dfd55f9" containerName="keystone-bootstrap" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.510123 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.512245 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.512845 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.512856 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.512865 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.513631 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cgg5d" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.515580 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.532653 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-57454d8655-v6bgf"] Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.569385 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.671439 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-public-tls-certs\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.671851 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krwpc\" 
(UniqueName: \"kubernetes.io/projected/142a3220-766a-49cc-bf87-9c879fc00222-kube-api-access-krwpc\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.671880 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-config-data\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.671940 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-credential-keys\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.672000 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-internal-tls-certs\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.672219 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-fernet-keys\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.672254 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-combined-ca-bundle\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.672344 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-scripts\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.689894 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f8865555d-rstbr" event={"ID":"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9","Type":"ContainerStarted","Data":"1270b0b911e5b618049966ffb4ab2a653fbf3f0e46723472dee8640b6326ac39"} Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.689951 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f8865555d-rstbr" event={"ID":"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9","Type":"ContainerStarted","Data":"518cbf48dfd1b55bc0d5bed9f412daa7dbb1c5792c0295cab9e0b4b3f2ddb3b7"} Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.690273 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.690327 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.692379 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f86f96cb9-gdgkm" event={"ID":"1bb43a06-461f-46ca-b3ed-419ee64ea40f","Type":"ContainerStarted","Data":"1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a"} Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.692419 4777 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f86f96cb9-gdgkm" event={"ID":"1bb43a06-461f-46ca-b3ed-419ee64ea40f","Type":"ContainerStarted","Data":"f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05"} Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.692434 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f86f96cb9-gdgkm" event={"ID":"1bb43a06-461f-46ca-b3ed-419ee64ea40f","Type":"ContainerStarted","Data":"0c26b31254dd8e5a9aae5b289e3dadc106471bb63878ed7e39e61f06a8203bfb"} Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.693053 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.693090 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.739473 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6f8865555d-rstbr" podStartSLOduration=6.739450986 podStartE2EDuration="6.739450986s" podCreationTimestamp="2026-02-16 21:58:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:52.719052775 +0000 UTC m=+1253.301553887" watchObservedRunningTime="2026-02-16 21:58:52.739450986 +0000 UTC m=+1253.321952108" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.756924 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f86f96cb9-gdgkm" podStartSLOduration=7.756885844 podStartE2EDuration="7.756885844s" podCreationTimestamp="2026-02-16 21:58:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:52.746534634 +0000 UTC m=+1253.329035736" watchObservedRunningTime="2026-02-16 
21:58:52.756885844 +0000 UTC m=+1253.339386946" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.773879 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-fernet-keys\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.773916 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-combined-ca-bundle\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.773959 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-scripts\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.774021 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-public-tls-certs\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.774064 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krwpc\" (UniqueName: \"kubernetes.io/projected/142a3220-766a-49cc-bf87-9c879fc00222-kube-api-access-krwpc\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: 
I0216 21:58:52.774081 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-config-data\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.774157 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-credential-keys\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.774200 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-internal-tls-certs\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.805561 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-credential-keys\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.806634 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krwpc\" (UniqueName: \"kubernetes.io/projected/142a3220-766a-49cc-bf87-9c879fc00222-kube-api-access-krwpc\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.807320 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-scripts\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.808111 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-public-tls-certs\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.808352 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-combined-ca-bundle\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.808313 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-fernet-keys\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.808829 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-config-data\") pod \"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.809565 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/142a3220-766a-49cc-bf87-9c879fc00222-internal-tls-certs\") pod 
\"keystone-57454d8655-v6bgf\" (UID: \"142a3220-766a-49cc-bf87-9c879fc00222\") " pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:52 crc kubenswrapper[4777]: I0216 21:58:52.877540 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:53 crc kubenswrapper[4777]: I0216 21:58:53.360257 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-57454d8655-v6bgf"] Feb 16 21:58:53 crc kubenswrapper[4777]: W0216 21:58:53.378951 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod142a3220_766a_49cc_bf87_9c879fc00222.slice/crio-2ed8b66ad4a8ecf258cb0d91bb8735daf91f87e52f6927492742da84851e2d4c WatchSource:0}: Error finding container 2ed8b66ad4a8ecf258cb0d91bb8735daf91f87e52f6927492742da84851e2d4c: Status 404 returned error can't find the container with id 2ed8b66ad4a8ecf258cb0d91bb8735daf91f87e52f6927492742da84851e2d4c Feb 16 21:58:53 crc kubenswrapper[4777]: I0216 21:58:53.704917 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-57454d8655-v6bgf" event={"ID":"142a3220-766a-49cc-bf87-9c879fc00222","Type":"ContainerStarted","Data":"b6591a3b015efdb795290d6e8c9f176f3ae77891eef78c3491b230874ef5c85b"} Feb 16 21:58:53 crc kubenswrapper[4777]: I0216 21:58:53.704962 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-57454d8655-v6bgf" event={"ID":"142a3220-766a-49cc-bf87-9c879fc00222","Type":"ContainerStarted","Data":"2ed8b66ad4a8ecf258cb0d91bb8735daf91f87e52f6927492742da84851e2d4c"} Feb 16 21:58:53 crc kubenswrapper[4777]: I0216 21:58:53.704979 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:58:53 crc kubenswrapper[4777]: I0216 21:58:53.705684 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:58:53 crc 
kubenswrapper[4777]: I0216 21:58:53.732518 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-57454d8655-v6bgf" podStartSLOduration=1.7324985389999998 podStartE2EDuration="1.732498539s" podCreationTimestamp="2026-02-16 21:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:58:53.722881439 +0000 UTC m=+1254.305382541" watchObservedRunningTime="2026-02-16 21:58:53.732498539 +0000 UTC m=+1254.314999631" Feb 16 21:58:53 crc kubenswrapper[4777]: I0216 21:58:53.987936 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.044735 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jg86v"] Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.044952 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v" podUID="ac680854-48f9-420b-8174-1a1766732f1a" containerName="dnsmasq-dns" containerID="cri-o://0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985" gracePeriod=10 Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.593309 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.712592 4777 generic.go:334] "Generic (PLEG): container finished" podID="ac680854-48f9-420b-8174-1a1766732f1a" containerID="0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985" exitCode=0 Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.713465 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.713970 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v" event={"ID":"ac680854-48f9-420b-8174-1a1766732f1a","Type":"ContainerDied","Data":"0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985"} Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.714017 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-jg86v" event={"ID":"ac680854-48f9-420b-8174-1a1766732f1a","Type":"ContainerDied","Data":"0f94f632a2c520daacc27f798100652f07d19803dd243eeaaea10910304f6078"} Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.714048 4777 scope.go:117] "RemoveContainer" containerID="0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.717124 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-nb\") pod \"ac680854-48f9-420b-8174-1a1766732f1a\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.717319 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-sb\") pod \"ac680854-48f9-420b-8174-1a1766732f1a\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.717380 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-svc\") pod \"ac680854-48f9-420b-8174-1a1766732f1a\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.717440 4777 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq258\" (UniqueName: \"kubernetes.io/projected/ac680854-48f9-420b-8174-1a1766732f1a-kube-api-access-jq258\") pod \"ac680854-48f9-420b-8174-1a1766732f1a\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.717473 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-swift-storage-0\") pod \"ac680854-48f9-420b-8174-1a1766732f1a\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.717491 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-config\") pod \"ac680854-48f9-420b-8174-1a1766732f1a\" (UID: \"ac680854-48f9-420b-8174-1a1766732f1a\") " Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.729931 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac680854-48f9-420b-8174-1a1766732f1a-kube-api-access-jq258" (OuterVolumeSpecName: "kube-api-access-jq258") pod "ac680854-48f9-420b-8174-1a1766732f1a" (UID: "ac680854-48f9-420b-8174-1a1766732f1a"). InnerVolumeSpecName "kube-api-access-jq258". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.781260 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ac680854-48f9-420b-8174-1a1766732f1a" (UID: "ac680854-48f9-420b-8174-1a1766732f1a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.788181 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac680854-48f9-420b-8174-1a1766732f1a" (UID: "ac680854-48f9-420b-8174-1a1766732f1a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.788275 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac680854-48f9-420b-8174-1a1766732f1a" (UID: "ac680854-48f9-420b-8174-1a1766732f1a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.811787 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-config" (OuterVolumeSpecName: "config") pod "ac680854-48f9-420b-8174-1a1766732f1a" (UID: "ac680854-48f9-420b-8174-1a1766732f1a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.813049 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac680854-48f9-420b-8174-1a1766732f1a" (UID: "ac680854-48f9-420b-8174-1a1766732f1a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.819960 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.819989 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.819998 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq258\" (UniqueName: \"kubernetes.io/projected/ac680854-48f9-420b-8174-1a1766732f1a-kube-api-access-jq258\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.820009 4777 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.820018 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.820029 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac680854-48f9-420b-8174-1a1766732f1a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.828031 4777 scope.go:117] "RemoveContainer" containerID="b19363ca8e15a52aa9e1a6160aca1d7abdd7fd3f0a5879f3a634940e2f4de8f2" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.845791 4777 scope.go:117] "RemoveContainer" 
containerID="0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985" Feb 16 21:58:54 crc kubenswrapper[4777]: E0216 21:58:54.846401 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985\": container with ID starting with 0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985 not found: ID does not exist" containerID="0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.846430 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985"} err="failed to get container status \"0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985\": rpc error: code = NotFound desc = could not find container \"0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985\": container with ID starting with 0f1952a58522cc38a854663b3da955fdf271d2aedeb93f801f13c627f9220985 not found: ID does not exist" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.846449 4777 scope.go:117] "RemoveContainer" containerID="b19363ca8e15a52aa9e1a6160aca1d7abdd7fd3f0a5879f3a634940e2f4de8f2" Feb 16 21:58:54 crc kubenswrapper[4777]: E0216 21:58:54.846871 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b19363ca8e15a52aa9e1a6160aca1d7abdd7fd3f0a5879f3a634940e2f4de8f2\": container with ID starting with b19363ca8e15a52aa9e1a6160aca1d7abdd7fd3f0a5879f3a634940e2f4de8f2 not found: ID does not exist" containerID="b19363ca8e15a52aa9e1a6160aca1d7abdd7fd3f0a5879f3a634940e2f4de8f2" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.846895 4777 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b19363ca8e15a52aa9e1a6160aca1d7abdd7fd3f0a5879f3a634940e2f4de8f2"} err="failed to get container status \"b19363ca8e15a52aa9e1a6160aca1d7abdd7fd3f0a5879f3a634940e2f4de8f2\": rpc error: code = NotFound desc = could not find container \"b19363ca8e15a52aa9e1a6160aca1d7abdd7fd3f0a5879f3a634940e2f4de8f2\": container with ID starting with b19363ca8e15a52aa9e1a6160aca1d7abdd7fd3f0a5879f3a634940e2f4de8f2 not found: ID does not exist" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.936315 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.936469 4777 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 16 21:58:54 crc kubenswrapper[4777]: I0216 21:58:54.937743 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 16 21:58:55 crc kubenswrapper[4777]: I0216 21:58:55.075489 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jg86v"] Feb 16 21:58:55 crc kubenswrapper[4777]: I0216 21:58:55.086929 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-jg86v"] Feb 16 21:58:55 crc kubenswrapper[4777]: I0216 21:58:55.726583 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pl2w2" event={"ID":"4a4d7671-062a-4647-9a54-6933f7cc3a4d","Type":"ContainerStarted","Data":"fbcb8df8f0c47ccb68ea6be776538a34e1adffce08b1bf040774222cce989117"} Feb 16 21:58:55 crc kubenswrapper[4777]: I0216 21:58:55.747830 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-pl2w2" podStartSLOduration=3.026614041 podStartE2EDuration="40.747812231s" podCreationTimestamp="2026-02-16 21:58:15 +0000 UTC" firstStartedPulling="2026-02-16 21:58:16.917297652 +0000 UTC m=+1217.499798754" 
lastFinishedPulling="2026-02-16 21:58:54.638495832 +0000 UTC m=+1255.220996944" observedRunningTime="2026-02-16 21:58:55.739533989 +0000 UTC m=+1256.322035111" watchObservedRunningTime="2026-02-16 21:58:55.747812231 +0000 UTC m=+1256.330313333" Feb 16 21:58:56 crc kubenswrapper[4777]: I0216 21:58:56.194540 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac680854-48f9-420b-8174-1a1766732f1a" path="/var/lib/kubelet/pods/ac680854-48f9-420b-8174-1a1766732f1a/volumes" Feb 16 21:58:57 crc kubenswrapper[4777]: I0216 21:58:57.749226 4777 generic.go:334] "Generic (PLEG): container finished" podID="4a4d7671-062a-4647-9a54-6933f7cc3a4d" containerID="fbcb8df8f0c47ccb68ea6be776538a34e1adffce08b1bf040774222cce989117" exitCode=0 Feb 16 21:58:57 crc kubenswrapper[4777]: I0216 21:58:57.749334 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pl2w2" event={"ID":"4a4d7671-062a-4647-9a54-6933f7cc3a4d","Type":"ContainerDied","Data":"fbcb8df8f0c47ccb68ea6be776538a34e1adffce08b1bf040774222cce989117"} Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.288577 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.303890 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcdh6\" (UniqueName: \"kubernetes.io/projected/4a4d7671-062a-4647-9a54-6933f7cc3a4d-kube-api-access-fcdh6\") pod \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.303961 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-combined-ca-bundle\") pod \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.304044 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-db-sync-config-data\") pod \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\" (UID: \"4a4d7671-062a-4647-9a54-6933f7cc3a4d\") " Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.311738 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a4d7671-062a-4647-9a54-6933f7cc3a4d-kube-api-access-fcdh6" (OuterVolumeSpecName: "kube-api-access-fcdh6") pod "4a4d7671-062a-4647-9a54-6933f7cc3a4d" (UID: "4a4d7671-062a-4647-9a54-6933f7cc3a4d"). InnerVolumeSpecName "kube-api-access-fcdh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.313794 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4a4d7671-062a-4647-9a54-6933f7cc3a4d" (UID: "4a4d7671-062a-4647-9a54-6933f7cc3a4d"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.353436 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a4d7671-062a-4647-9a54-6933f7cc3a4d" (UID: "4a4d7671-062a-4647-9a54-6933f7cc3a4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.405754 4777 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.405787 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4d7671-062a-4647-9a54-6933f7cc3a4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.405796 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcdh6\" (UniqueName: \"kubernetes.io/projected/4a4d7671-062a-4647-9a54-6933f7cc3a4d-kube-api-access-fcdh6\") on node \"crc\" DevicePath \"\"" Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.773075 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pl2w2" event={"ID":"4a4d7671-062a-4647-9a54-6933f7cc3a4d","Type":"ContainerDied","Data":"8398bc254479e0c3eae0e2cad34da419d9bfc19c26e641f79d274367ca625bf0"} Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.773469 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8398bc254479e0c3eae0e2cad34da419d9bfc19c26e641f79d274367ca625bf0" Feb 16 21:58:59 crc kubenswrapper[4777]: I0216 21:58:59.773542 4777 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pl2w2" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.093443 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6464bc7759-rjmxp"] Feb 16 21:59:00 crc kubenswrapper[4777]: E0216 21:59:00.093871 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4d7671-062a-4647-9a54-6933f7cc3a4d" containerName="barbican-db-sync" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.093888 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4d7671-062a-4647-9a54-6933f7cc3a4d" containerName="barbican-db-sync" Feb 16 21:59:00 crc kubenswrapper[4777]: E0216 21:59:00.093902 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac680854-48f9-420b-8174-1a1766732f1a" containerName="dnsmasq-dns" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.093908 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac680854-48f9-420b-8174-1a1766732f1a" containerName="dnsmasq-dns" Feb 16 21:59:00 crc kubenswrapper[4777]: E0216 21:59:00.093923 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac680854-48f9-420b-8174-1a1766732f1a" containerName="init" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.093930 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac680854-48f9-420b-8174-1a1766732f1a" containerName="init" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.094105 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a4d7671-062a-4647-9a54-6933f7cc3a4d" containerName="barbican-db-sync" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.094138 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac680854-48f9-420b-8174-1a1766732f1a" containerName="dnsmasq-dns" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.095216 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.100503 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.100613 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.100842 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2xgvp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.109239 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6464bc7759-rjmxp"] Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.117920 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6492\" (UniqueName: \"kubernetes.io/projected/dcf235a4-5b60-4771-9654-1a02d4a9144d-kube-api-access-j6492\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.117965 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcf235a4-5b60-4771-9654-1a02d4a9144d-config-data-custom\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.117988 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcf235a4-5b60-4771-9654-1a02d4a9144d-config-data\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " 
pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.118054 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf235a4-5b60-4771-9654-1a02d4a9144d-combined-ca-bundle\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.118096 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf235a4-5b60-4771-9654-1a02d4a9144d-logs\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.220045 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6492\" (UniqueName: \"kubernetes.io/projected/dcf235a4-5b60-4771-9654-1a02d4a9144d-kube-api-access-j6492\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.220287 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcf235a4-5b60-4771-9654-1a02d4a9144d-config-data-custom\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.220311 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcf235a4-5b60-4771-9654-1a02d4a9144d-config-data\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: 
\"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.220389 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf235a4-5b60-4771-9654-1a02d4a9144d-combined-ca-bundle\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.220442 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf235a4-5b60-4771-9654-1a02d4a9144d-logs\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.226495 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.226694 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.232392 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5494fbd488-n9m64"] Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.234624 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.232401 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf235a4-5b60-4771-9654-1a02d4a9144d-logs\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.240372 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf235a4-5b60-4771-9654-1a02d4a9144d-combined-ca-bundle\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.241970 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcf235a4-5b60-4771-9654-1a02d4a9144d-config-data-custom\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.243592 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6492\" (UniqueName: \"kubernetes.io/projected/dcf235a4-5b60-4771-9654-1a02d4a9144d-kube-api-access-j6492\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.247011 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5494fbd488-n9m64"] Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.250622 4777 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"barbican-keystone-listener-config-data" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.263314 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tcf6b"] Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.263730 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcf235a4-5b60-4771-9654-1a02d4a9144d-config-data\") pod \"barbican-worker-6464bc7759-rjmxp\" (UID: \"dcf235a4-5b60-4771-9654-1a02d4a9144d\") " pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.267772 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.304457 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tcf6b"] Feb 16 21:59:00 crc kubenswrapper[4777]: E0216 21:59:00.325280 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 21:59:00 crc kubenswrapper[4777]: E0216 21:59:00.325327 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 21:59:00 crc kubenswrapper[4777]: E0216 21:59:00.325440 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessPr
obe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError" Feb 16 21:59:00 crc kubenswrapper[4777]: E0216 21:59:00.327806 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.335414 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-config\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.335546 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.335626 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpsqz\" (UniqueName: \"kubernetes.io/projected/17462730-ddcb-4b67-9d41-30ccec27e89c-kube-api-access-zpsqz\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.335683 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.335744 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.335912 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.411770 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-586895bbf8-5lcdq"] Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.423120 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2xgvp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.429088 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.430758 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6464bc7759-rjmxp" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.432610 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.442279 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-586895bbf8-5lcdq"] Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.443635 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpsqz\" (UniqueName: \"kubernetes.io/projected/17462730-ddcb-4b67-9d41-30ccec27e89c-kube-api-access-zpsqz\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.443670 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.443698 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.443764 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3341e79f-f4de-47f5-8cc8-595f1b4fb837-combined-ca-bundle\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " 
pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.443791 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.443820 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3341e79f-f4de-47f5-8cc8-595f1b4fb837-config-data\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.443875 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-config\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.443892 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5hvh\" (UniqueName: \"kubernetes.io/projected/3341e79f-f4de-47f5-8cc8-595f1b4fb837-kube-api-access-z5hvh\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.443930 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3341e79f-f4de-47f5-8cc8-595f1b4fb837-logs\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" 
(UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.445831 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.446323 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.446798 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3341e79f-f4de-47f5-8cc8-595f1b4fb837-config-data-custom\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.446859 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.447457 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-config\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: 
\"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.447615 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.457382 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.481539 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpsqz\" (UniqueName: \"kubernetes.io/projected/17462730-ddcb-4b67-9d41-30ccec27e89c-kube-api-access-zpsqz\") pod \"dnsmasq-dns-85ff748b95-tcf6b\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.547375 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-combined-ca-bundle\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.547422 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5hvh\" (UniqueName: \"kubernetes.io/projected/3341e79f-f4de-47f5-8cc8-595f1b4fb837-kube-api-access-z5hvh\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: 
\"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.547441 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3341e79f-f4de-47f5-8cc8-595f1b4fb837-logs\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.547469 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3341e79f-f4de-47f5-8cc8-595f1b4fb837-config-data-custom\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.547496 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/866497a0-a152-4937-b43e-45aec58b4fea-logs\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.547519 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8glj\" (UniqueName: \"kubernetes.io/projected/866497a0-a152-4937-b43e-45aec58b4fea-kube-api-access-t8glj\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.547579 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data-custom\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.547607 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3341e79f-f4de-47f5-8cc8-595f1b4fb837-combined-ca-bundle\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.547733 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3341e79f-f4de-47f5-8cc8-595f1b4fb837-config-data\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.547859 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.548286 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3341e79f-f4de-47f5-8cc8-595f1b4fb837-logs\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.551601 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3341e79f-f4de-47f5-8cc8-595f1b4fb837-config-data-custom\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.552769 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3341e79f-f4de-47f5-8cc8-595f1b4fb837-combined-ca-bundle\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.557379 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3341e79f-f4de-47f5-8cc8-595f1b4fb837-config-data\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.565693 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5hvh\" (UniqueName: \"kubernetes.io/projected/3341e79f-f4de-47f5-8cc8-595f1b4fb837-kube-api-access-z5hvh\") pod \"barbican-keystone-listener-5494fbd488-n9m64\" (UID: \"3341e79f-f4de-47f5-8cc8-595f1b4fb837\") " pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.621197 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.631410 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.655950 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/866497a0-a152-4937-b43e-45aec58b4fea-logs\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.657730 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8glj\" (UniqueName: \"kubernetes.io/projected/866497a0-a152-4937-b43e-45aec58b4fea-kube-api-access-t8glj\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.657919 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data-custom\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.658092 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.658173 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-combined-ca-bundle\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " 
pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.658349 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/866497a0-a152-4937-b43e-45aec58b4fea-logs\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.664418 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data-custom\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.666208 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-combined-ca-bundle\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.668654 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.679880 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8glj\" (UniqueName: \"kubernetes.io/projected/866497a0-a152-4937-b43e-45aec58b4fea-kube-api-access-t8glj\") pod \"barbican-api-586895bbf8-5lcdq\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: 
I0216 21:59:00.806923 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="ceilometer-central-agent" containerID="cri-o://601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f" gracePeriod=30 Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.807180 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56","Type":"ContainerStarted","Data":"b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59"} Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.807218 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.807446 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="proxy-httpd" containerID="cri-o://b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59" gracePeriod=30 Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.807491 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="sg-core" containerID="cri-o://f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233" gracePeriod=30 Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.807523 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="ceilometer-notification-agent" containerID="cri-o://6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f" gracePeriod=30 Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.822886 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-z9tj6" 
event={"ID":"626f6429-977f-4c1f-b055-3502cb530645","Type":"ContainerStarted","Data":"c8b8f4f19e1088f06bd68a586cb548410b29090ae7567e1811d9628dd9caf510"} Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.845708 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.957690367 podStartE2EDuration="45.845690368s" podCreationTimestamp="2026-02-16 21:58:15 +0000 UTC" firstStartedPulling="2026-02-16 21:58:17.868892703 +0000 UTC m=+1218.451393805" lastFinishedPulling="2026-02-16 21:58:59.756892684 +0000 UTC m=+1260.339393806" observedRunningTime="2026-02-16 21:59:00.842257332 +0000 UTC m=+1261.424758434" watchObservedRunningTime="2026-02-16 21:59:00.845690368 +0000 UTC m=+1261.428191460" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.890733 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-z9tj6" podStartSLOduration=2.639306393 podStartE2EDuration="45.890699699s" podCreationTimestamp="2026-02-16 21:58:15 +0000 UTC" firstStartedPulling="2026-02-16 21:58:16.477604567 +0000 UTC m=+1217.060105669" lastFinishedPulling="2026-02-16 21:58:59.728997883 +0000 UTC m=+1260.311498975" observedRunningTime="2026-02-16 21:59:00.859967758 +0000 UTC m=+1261.442468860" watchObservedRunningTime="2026-02-16 21:59:00.890699699 +0000 UTC m=+1261.473200791" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.909694 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:00 crc kubenswrapper[4777]: I0216 21:59:00.909998 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6464bc7759-rjmxp"] Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.243591 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tcf6b"] Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.301854 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5494fbd488-n9m64"] Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.475318 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-586895bbf8-5lcdq"] Feb 16 21:59:01 crc kubenswrapper[4777]: W0216 21:59:01.483976 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod866497a0_a152_4937_b43e_45aec58b4fea.slice/crio-d4cd18bcc1bdbb77fbfd377acc88c191900e364eac6b328e3ad21b967e2675d0 WatchSource:0}: Error finding container d4cd18bcc1bdbb77fbfd377acc88c191900e364eac6b328e3ad21b967e2675d0: Status 404 returned error can't find the container with id d4cd18bcc1bdbb77fbfd377acc88c191900e364eac6b328e3ad21b967e2675d0 Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.834054 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6464bc7759-rjmxp" event={"ID":"dcf235a4-5b60-4771-9654-1a02d4a9144d","Type":"ContainerStarted","Data":"5d71031a1aa5eb4b81f5e1fa7d15263b1b3c6e9359087ecc2b7af3f1f9cbc7b5"} Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.836386 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-586895bbf8-5lcdq" event={"ID":"866497a0-a152-4937-b43e-45aec58b4fea","Type":"ContainerStarted","Data":"3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192"} Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.836410 4777 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-586895bbf8-5lcdq" event={"ID":"866497a0-a152-4937-b43e-45aec58b4fea","Type":"ContainerStarted","Data":"d4cd18bcc1bdbb77fbfd377acc88c191900e364eac6b328e3ad21b967e2675d0"} Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.838678 4777 generic.go:334] "Generic (PLEG): container finished" podID="17462730-ddcb-4b67-9d41-30ccec27e89c" containerID="c83d566f60af0e46e4ab95fa288566cf666e3ddc94b152bb1a7645b5ecebf9bc" exitCode=0 Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.838762 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" event={"ID":"17462730-ddcb-4b67-9d41-30ccec27e89c","Type":"ContainerDied","Data":"c83d566f60af0e46e4ab95fa288566cf666e3ddc94b152bb1a7645b5ecebf9bc"} Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.838820 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" event={"ID":"17462730-ddcb-4b67-9d41-30ccec27e89c","Type":"ContainerStarted","Data":"388dd02c221c32b0668b769329c58c148546e2816c6562f0064f447af839b1c7"} Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.852364 4777 generic.go:334] "Generic (PLEG): container finished" podID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerID="b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59" exitCode=0 Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.852402 4777 generic.go:334] "Generic (PLEG): container finished" podID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerID="f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233" exitCode=2 Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.852414 4777 generic.go:334] "Generic (PLEG): container finished" podID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerID="601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f" exitCode=0 Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.852452 4777 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56","Type":"ContainerDied","Data":"b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59"} Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.852492 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56","Type":"ContainerDied","Data":"f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233"} Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.852507 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56","Type":"ContainerDied","Data":"601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f"} Feb 16 21:59:01 crc kubenswrapper[4777]: I0216 21:59:01.854002 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" event={"ID":"3341e79f-f4de-47f5-8cc8-595f1b4fb837","Type":"ContainerStarted","Data":"fdc1638d5a7edc7b024f1640df12e009ac0d280255970e459135648e05f647d9"} Feb 16 21:59:02 crc kubenswrapper[4777]: I0216 21:59:02.874095 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6464bc7759-rjmxp" event={"ID":"dcf235a4-5b60-4771-9654-1a02d4a9144d","Type":"ContainerStarted","Data":"c59125aab3b8f87216f946eae57c08b7877912a2d02e5c633fde96651116734f"} Feb 16 21:59:02 crc kubenswrapper[4777]: I0216 21:59:02.874730 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6464bc7759-rjmxp" event={"ID":"dcf235a4-5b60-4771-9654-1a02d4a9144d","Type":"ContainerStarted","Data":"2ecd08e7e753ee4c6e66d6e6d3b6d6e847d16aa6d688d765960f1ca297cf4d2f"} Feb 16 21:59:02 crc kubenswrapper[4777]: I0216 21:59:02.881191 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-586895bbf8-5lcdq" 
event={"ID":"866497a0-a152-4937-b43e-45aec58b4fea","Type":"ContainerStarted","Data":"7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede"} Feb 16 21:59:02 crc kubenswrapper[4777]: I0216 21:59:02.881268 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:02 crc kubenswrapper[4777]: I0216 21:59:02.881306 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:02 crc kubenswrapper[4777]: I0216 21:59:02.885405 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" event={"ID":"17462730-ddcb-4b67-9d41-30ccec27e89c","Type":"ContainerStarted","Data":"cab5daac83f97dcf28a00c004b3c4d8bc3c45cbdad27f0f2623668abd1372dff"} Feb 16 21:59:02 crc kubenswrapper[4777]: I0216 21:59:02.885584 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:02 crc kubenswrapper[4777]: I0216 21:59:02.907208 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6464bc7759-rjmxp" podStartSLOduration=1.6705777720000001 podStartE2EDuration="2.907191945s" podCreationTimestamp="2026-02-16 21:59:00 +0000 UTC" firstStartedPulling="2026-02-16 21:59:00.885484703 +0000 UTC m=+1261.467985805" lastFinishedPulling="2026-02-16 21:59:02.122098876 +0000 UTC m=+1262.704599978" observedRunningTime="2026-02-16 21:59:02.896079903 +0000 UTC m=+1263.478581025" watchObservedRunningTime="2026-02-16 21:59:02.907191945 +0000 UTC m=+1263.489693047" Feb 16 21:59:02 crc kubenswrapper[4777]: I0216 21:59:02.941078 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-586895bbf8-5lcdq" podStartSLOduration=2.941053313 podStartE2EDuration="2.941053313s" podCreationTimestamp="2026-02-16 21:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:02.934827119 +0000 UTC m=+1263.517328221" watchObservedRunningTime="2026-02-16 21:59:02.941053313 +0000 UTC m=+1263.523554445" Feb 16 21:59:02 crc kubenswrapper[4777]: I0216 21:59:02.959064 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" podStartSLOduration=2.959046567 podStartE2EDuration="2.959046567s" podCreationTimestamp="2026-02-16 21:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:02.951159326 +0000 UTC m=+1263.533660428" watchObservedRunningTime="2026-02-16 21:59:02.959046567 +0000 UTC m=+1263.541547669" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.045386 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5b898f6696-7cphs"] Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.048324 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.052249 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.052451 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.058073 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b898f6696-7cphs"] Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.106867 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-config-data\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.106944 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-config-data-custom\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.106994 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-public-tls-certs\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.107022 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fcf8e71c-6c4e-4724-8d06-46aed549c48f-logs\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.107105 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfdrf\" (UniqueName: \"kubernetes.io/projected/fcf8e71c-6c4e-4724-8d06-46aed549c48f-kube-api-access-nfdrf\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.107134 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-combined-ca-bundle\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.107163 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-internal-tls-certs\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.208411 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-public-tls-certs\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.208462 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcf8e71c-6c4e-4724-8d06-46aed549c48f-logs\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.208543 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfdrf\" (UniqueName: \"kubernetes.io/projected/fcf8e71c-6c4e-4724-8d06-46aed549c48f-kube-api-access-nfdrf\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.208567 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-combined-ca-bundle\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.208592 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-internal-tls-certs\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.208675 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-config-data\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.208724 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-config-data-custom\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.208935 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcf8e71c-6c4e-4724-8d06-46aed549c48f-logs\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.214126 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-public-tls-certs\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.215358 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-combined-ca-bundle\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.215539 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-config-data\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.216002 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-internal-tls-certs\") pod 
\"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.219814 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fcf8e71c-6c4e-4724-8d06-46aed549c48f-config-data-custom\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.230471 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfdrf\" (UniqueName: \"kubernetes.io/projected/fcf8e71c-6c4e-4724-8d06-46aed549c48f-kube-api-access-nfdrf\") pod \"barbican-api-5b898f6696-7cphs\" (UID: \"fcf8e71c-6c4e-4724-8d06-46aed549c48f\") " pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.388740 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.896063 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" event={"ID":"3341e79f-f4de-47f5-8cc8-595f1b4fb837","Type":"ContainerStarted","Data":"0da589b11016259f95eb1dc95719e0bfdf861ec4cb755a06bb89b8b404bd2b3e"} Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.896395 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" event={"ID":"3341e79f-f4de-47f5-8cc8-595f1b4fb837","Type":"ContainerStarted","Data":"a2cf702787df8744e57c4d0f170af6ca0498b102eb475e1b972dce56e60ff509"} Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.908290 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b898f6696-7cphs"] Feb 16 21:59:03 crc kubenswrapper[4777]: I0216 21:59:03.937142 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5494fbd488-n9m64" podStartSLOduration=1.9371106249999999 podStartE2EDuration="3.93711508s" podCreationTimestamp="2026-02-16 21:59:00 +0000 UTC" firstStartedPulling="2026-02-16 21:59:01.304554019 +0000 UTC m=+1261.887055121" lastFinishedPulling="2026-02-16 21:59:03.304558444 +0000 UTC m=+1263.887059576" observedRunningTime="2026-02-16 21:59:03.922249944 +0000 UTC m=+1264.504751046" watchObservedRunningTime="2026-02-16 21:59:03.93711508 +0000 UTC m=+1264.519616182" Feb 16 21:59:03 crc kubenswrapper[4777]: W0216 21:59:03.940264 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcf8e71c_6c4e_4724_8d06_46aed549c48f.slice/crio-26c8f4e4b340940ab94d3ee3c421b56045948b086a64a409c5854122d339d2b7 WatchSource:0}: Error finding container 26c8f4e4b340940ab94d3ee3c421b56045948b086a64a409c5854122d339d2b7: Status 404 returned error can't find the 
container with id 26c8f4e4b340940ab94d3ee3c421b56045948b086a64a409c5854122d339d2b7 Feb 16 21:59:04 crc kubenswrapper[4777]: I0216 21:59:04.907342 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b898f6696-7cphs" event={"ID":"fcf8e71c-6c4e-4724-8d06-46aed549c48f","Type":"ContainerStarted","Data":"bc1949f98f9a90c276236ebe2fec40eb3253485a1d5b3d56b0615aead9703a05"} Feb 16 21:59:04 crc kubenswrapper[4777]: I0216 21:59:04.908667 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b898f6696-7cphs" event={"ID":"fcf8e71c-6c4e-4724-8d06-46aed549c48f","Type":"ContainerStarted","Data":"6a8b4fd4ea3a66701be5c4d8512a06bbd96a604e3b3a0eaa4e1866eb7a1f143e"} Feb 16 21:59:04 crc kubenswrapper[4777]: I0216 21:59:04.908806 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b898f6696-7cphs" event={"ID":"fcf8e71c-6c4e-4724-8d06-46aed549c48f","Type":"ContainerStarted","Data":"26c8f4e4b340940ab94d3ee3c421b56045948b086a64a409c5854122d339d2b7"} Feb 16 21:59:04 crc kubenswrapper[4777]: I0216 21:59:04.933826 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5b898f6696-7cphs" podStartSLOduration=1.933804824 podStartE2EDuration="1.933804824s" podCreationTimestamp="2026-02-16 21:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:04.929671868 +0000 UTC m=+1265.512172980" watchObservedRunningTime="2026-02-16 21:59:04.933804824 +0000 UTC m=+1265.516305936" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.694306 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.874784 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-scripts\") pod \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.874895 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-sg-core-conf-yaml\") pod \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.874982 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-log-httpd\") pod \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.875013 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-config-data\") pod \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.875260 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-run-httpd\") pod \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.875291 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-combined-ca-bundle\") pod \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.875346 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8s8j\" (UniqueName: \"kubernetes.io/projected/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-kube-api-access-z8s8j\") pod \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\" (UID: \"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56\") " Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.875857 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" (UID: "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.876119 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" (UID: "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.876445 4777 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.876476 4777 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.883176 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-scripts" (OuterVolumeSpecName: "scripts") pod "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" (UID: "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.883994 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-kube-api-access-z8s8j" (OuterVolumeSpecName: "kube-api-access-z8s8j") pod "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" (UID: "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56"). InnerVolumeSpecName "kube-api-access-z8s8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.924571 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" (UID: "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.927514 4777 generic.go:334] "Generic (PLEG): container finished" podID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerID="6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f" exitCode=0 Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.927612 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56","Type":"ContainerDied","Data":"6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f"} Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.927644 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fe0aceb-b5ac-4807-b60e-4e43f9b13b56","Type":"ContainerDied","Data":"453fce8f4b05bf10c4eedaae590d937b40e45e5609e102db6ff9eae1ef01a333"} Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.927669 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.927687 4777 scope.go:117] "RemoveContainer" containerID="b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.931629 4777 generic.go:334] "Generic (PLEG): container finished" podID="626f6429-977f-4c1f-b055-3502cb530645" containerID="c8b8f4f19e1088f06bd68a586cb548410b29090ae7567e1811d9628dd9caf510" exitCode=0 Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.931690 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-z9tj6" event={"ID":"626f6429-977f-4c1f-b055-3502cb530645","Type":"ContainerDied","Data":"c8b8f4f19e1088f06bd68a586cb548410b29090ae7567e1811d9628dd9caf510"} Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.932122 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.932172 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b898f6696-7cphs" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.979942 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8s8j\" (UniqueName: \"kubernetes.io/projected/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-kube-api-access-z8s8j\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.979976 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:05 crc kubenswrapper[4777]: I0216 21:59:05.979990 4777 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 
21:59:06.021728 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" (UID: "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.043516 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-config-data" (OuterVolumeSpecName: "config-data") pod "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" (UID: "0fe0aceb-b5ac-4807-b60e-4e43f9b13b56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.082205 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.082234 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.087438 4777 scope.go:117] "RemoveContainer" containerID="f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.112985 4777 scope.go:117] "RemoveContainer" containerID="6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.141829 4777 scope.go:117] "RemoveContainer" containerID="601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.171521 4777 scope.go:117] "RemoveContainer" 
containerID="b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59" Feb 16 21:59:06 crc kubenswrapper[4777]: E0216 21:59:06.172374 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59\": container with ID starting with b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59 not found: ID does not exist" containerID="b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.172438 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59"} err="failed to get container status \"b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59\": rpc error: code = NotFound desc = could not find container \"b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59\": container with ID starting with b55f553f8f197bfd26d8f1b056b13d0fd772f343f4e5536a5a269d60af0f9d59 not found: ID does not exist" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.172472 4777 scope.go:117] "RemoveContainer" containerID="f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233" Feb 16 21:59:06 crc kubenswrapper[4777]: E0216 21:59:06.173152 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233\": container with ID starting with f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233 not found: ID does not exist" containerID="f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.173215 4777 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233"} err="failed to get container status \"f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233\": rpc error: code = NotFound desc = could not find container \"f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233\": container with ID starting with f4242dffa734ef9323896c8592e32a9cea5ff1165096733db17e849e78dbd233 not found: ID does not exist" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.173259 4777 scope.go:117] "RemoveContainer" containerID="6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f" Feb 16 21:59:06 crc kubenswrapper[4777]: E0216 21:59:06.174114 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f\": container with ID starting with 6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f not found: ID does not exist" containerID="6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.174168 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f"} err="failed to get container status \"6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f\": rpc error: code = NotFound desc = could not find container \"6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f\": container with ID starting with 6a7fb515c9b7b5fe68a1b9897b2d7f0c8a6ed67224a0bee294829eb603a47e6f not found: ID does not exist" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.174187 4777 scope.go:117] "RemoveContainer" containerID="601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f" Feb 16 21:59:06 crc kubenswrapper[4777]: E0216 21:59:06.174774 4777 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f\": container with ID starting with 601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f not found: ID does not exist" containerID="601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.174850 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f"} err="failed to get container status \"601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f\": rpc error: code = NotFound desc = could not find container \"601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f\": container with ID starting with 601071ede11939b147a7870290a6c362d870d2a3dc4e06e6966bce18456cb77f not found: ID does not exist" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.301947 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.312144 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.337700 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:06 crc kubenswrapper[4777]: E0216 21:59:06.338260 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="proxy-httpd" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.338298 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="proxy-httpd" Feb 16 21:59:06 crc kubenswrapper[4777]: E0216 21:59:06.338332 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="ceilometer-notification-agent" Feb 16 
21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.338341 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="ceilometer-notification-agent" Feb 16 21:59:06 crc kubenswrapper[4777]: E0216 21:59:06.338357 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="sg-core" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.338365 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="sg-core" Feb 16 21:59:06 crc kubenswrapper[4777]: E0216 21:59:06.338391 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="ceilometer-central-agent" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.338401 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="ceilometer-central-agent" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.338629 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="ceilometer-notification-agent" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.338651 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="sg-core" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.338663 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="ceilometer-central-agent" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.338679 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" containerName="proxy-httpd" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.341779 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.345693 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.345892 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.352016 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.391388 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-run-httpd\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.391739 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-scripts\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.391857 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-log-httpd\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.391980 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-config-data\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 
21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.392015 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.392111 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scm8t\" (UniqueName: \"kubernetes.io/projected/e8858652-305e-4b14-bbed-39a09b77d30f-kube-api-access-scm8t\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.392144 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.494585 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scm8t\" (UniqueName: \"kubernetes.io/projected/e8858652-305e-4b14-bbed-39a09b77d30f-kube-api-access-scm8t\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.494988 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.495163 4777 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-run-httpd\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.495394 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-scripts\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.495636 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-log-httpd\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.495770 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-run-httpd\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.496022 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-log-httpd\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.496160 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-config-data\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.496310 
4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.503778 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-config-data\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.506265 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.506880 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.512382 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-scripts\") pod \"ceilometer-0\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.523563 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scm8t\" (UniqueName: \"kubernetes.io/projected/e8858652-305e-4b14-bbed-39a09b77d30f-kube-api-access-scm8t\") pod \"ceilometer-0\" (UID: 
\"e8858652-305e-4b14-bbed-39a09b77d30f\") " pod="openstack/ceilometer-0" Feb 16 21:59:06 crc kubenswrapper[4777]: I0216 21:59:06.659740 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.208941 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.402156 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.517211 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/626f6429-977f-4c1f-b055-3502cb530645-etc-machine-id\") pod \"626f6429-977f-4c1f-b055-3502cb530645\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.517337 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/626f6429-977f-4c1f-b055-3502cb530645-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "626f6429-977f-4c1f-b055-3502cb530645" (UID: "626f6429-977f-4c1f-b055-3502cb530645"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.517767 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-db-sync-config-data\") pod \"626f6429-977f-4c1f-b055-3502cb530645\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.519523 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg4k2\" (UniqueName: \"kubernetes.io/projected/626f6429-977f-4c1f-b055-3502cb530645-kube-api-access-zg4k2\") pod \"626f6429-977f-4c1f-b055-3502cb530645\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.519706 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-config-data\") pod \"626f6429-977f-4c1f-b055-3502cb530645\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.524307 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-scripts\") pod \"626f6429-977f-4c1f-b055-3502cb530645\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.524384 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-combined-ca-bundle\") pod \"626f6429-977f-4c1f-b055-3502cb530645\" (UID: \"626f6429-977f-4c1f-b055-3502cb530645\") " Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.525313 4777 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/626f6429-977f-4c1f-b055-3502cb530645-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.525843 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "626f6429-977f-4c1f-b055-3502cb530645" (UID: "626f6429-977f-4c1f-b055-3502cb530645"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.528563 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-scripts" (OuterVolumeSpecName: "scripts") pod "626f6429-977f-4c1f-b055-3502cb530645" (UID: "626f6429-977f-4c1f-b055-3502cb530645"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.531451 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/626f6429-977f-4c1f-b055-3502cb530645-kube-api-access-zg4k2" (OuterVolumeSpecName: "kube-api-access-zg4k2") pod "626f6429-977f-4c1f-b055-3502cb530645" (UID: "626f6429-977f-4c1f-b055-3502cb530645"). InnerVolumeSpecName "kube-api-access-zg4k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.560170 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "626f6429-977f-4c1f-b055-3502cb530645" (UID: "626f6429-977f-4c1f-b055-3502cb530645"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.595868 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-config-data" (OuterVolumeSpecName: "config-data") pod "626f6429-977f-4c1f-b055-3502cb530645" (UID: "626f6429-977f-4c1f-b055-3502cb530645"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.626780 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.626823 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.626842 4777 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.626857 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg4k2\" (UniqueName: \"kubernetes.io/projected/626f6429-977f-4c1f-b055-3502cb530645-kube-api-access-zg4k2\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.626872 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626f6429-977f-4c1f-b055-3502cb530645-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.965753 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e8858652-305e-4b14-bbed-39a09b77d30f","Type":"ContainerStarted","Data":"25de2aaa8c60033eabacde3caf311d985aeed02bed737d7ce787d66bd84b2410"} Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.965825 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8858652-305e-4b14-bbed-39a09b77d30f","Type":"ContainerStarted","Data":"873a356f3d9b329d5af27e348552b6566f7a8864aef613dfba89cacc48642fcd"} Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.967969 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-z9tj6" event={"ID":"626f6429-977f-4c1f-b055-3502cb530645","Type":"ContainerDied","Data":"7c703b74e9f817e86c6ec11e94c6e08afd9368aa409198183eadfb7184ef2566"} Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.968010 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c703b74e9f817e86c6ec11e94c6e08afd9368aa409198183eadfb7184ef2566" Feb 16 21:59:07 crc kubenswrapper[4777]: I0216 21:59:07.968048 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-z9tj6" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.200706 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fe0aceb-b5ac-4807-b60e-4e43f9b13b56" path="/var/lib/kubelet/pods/0fe0aceb-b5ac-4807-b60e-4e43f9b13b56/volumes" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.453460 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tcf6b"] Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.453690 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" podUID="17462730-ddcb-4b67-9d41-30ccec27e89c" containerName="dnsmasq-dns" containerID="cri-o://cab5daac83f97dcf28a00c004b3c4d8bc3c45cbdad27f0f2623668abd1372dff" gracePeriod=10 Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.464808 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.516729 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-m28rz"] Feb 16 21:59:08 crc kubenswrapper[4777]: E0216 21:59:08.517149 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626f6429-977f-4c1f-b055-3502cb530645" containerName="cinder-db-sync" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.517166 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="626f6429-977f-4c1f-b055-3502cb530645" containerName="cinder-db-sync" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.517372 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="626f6429-977f-4c1f-b055-3502cb530645" containerName="cinder-db-sync" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.518477 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.547312 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.550454 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.555851 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.556160 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.556311 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nbmqh" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.556425 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.561975 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-m28rz"] Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.570808 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653107 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653375 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653423 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653442 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653479 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scnjp\" (UniqueName: \"kubernetes.io/projected/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-kube-api-access-scnjp\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653543 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-scripts\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653564 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653594 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-config\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653610 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653630 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653650 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27wwn\" (UniqueName: \"kubernetes.io/projected/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-kube-api-access-27wwn\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.653696 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.661118 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.667185 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.670087 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.689945 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.757203 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.757383 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27wwn\" (UniqueName: \"kubernetes.io/projected/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-kube-api-access-27wwn\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.757534 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 
21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.757677 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmwcp\" (UniqueName: \"kubernetes.io/projected/630e92a1-ae10-4fa5-a006-fac7619ca69b-kube-api-access-qmwcp\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.757872 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.758018 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/630e92a1-ae10-4fa5-a006-fac7619ca69b-logs\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.758146 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data-custom\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.758261 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.758386 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.758557 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.763149 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.772859 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27wwn\" (UniqueName: \"kubernetes.io/projected/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-kube-api-access-27wwn\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.773529 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.773626 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.773650 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.773670 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/630e92a1-ae10-4fa5-a006-fac7619ca69b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.773699 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scnjp\" (UniqueName: \"kubernetes.io/projected/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-kube-api-access-scnjp\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.773769 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-scripts\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.773806 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.773875 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-scripts\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.773897 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-config\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.773923 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.774160 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.774828 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.775085 
4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.775105 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.775927 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-config\") pod \"dnsmasq-dns-5c9776ccc5-m28rz\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.779983 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.780448 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-scripts\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.781428 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.789880 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scnjp\" (UniqueName: \"kubernetes.io/projected/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-kube-api-access-scnjp\") pod \"cinder-scheduler-0\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.876009 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmwcp\" (UniqueName: \"kubernetes.io/projected/630e92a1-ae10-4fa5-a006-fac7619ca69b-kube-api-access-qmwcp\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.876274 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/630e92a1-ae10-4fa5-a006-fac7619ca69b-logs\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.876360 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data-custom\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.876498 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.876626 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/630e92a1-ae10-4fa5-a006-fac7619ca69b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.876749 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-scripts\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.876862 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.877581 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/630e92a1-ae10-4fa5-a006-fac7619ca69b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.878088 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/630e92a1-ae10-4fa5-a006-fac7619ca69b-logs\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.880499 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data-custom\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " 
pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.885014 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-scripts\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.885489 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.886653 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.895420 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmwcp\" (UniqueName: \"kubernetes.io/projected/630e92a1-ae10-4fa5-a006-fac7619ca69b-kube-api-access-qmwcp\") pod \"cinder-api-0\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " pod="openstack/cinder-api-0" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.967041 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.979286 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8858652-305e-4b14-bbed-39a09b77d30f","Type":"ContainerStarted","Data":"982413fbef394604ecd4fd9c043b24a9c8d514c433fa28f79f33b633c5012ffb"} Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.981557 4777 generic.go:334] "Generic (PLEG): container finished" podID="17462730-ddcb-4b67-9d41-30ccec27e89c" containerID="cab5daac83f97dcf28a00c004b3c4d8bc3c45cbdad27f0f2623668abd1372dff" exitCode=0 Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.981597 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" event={"ID":"17462730-ddcb-4b67-9d41-30ccec27e89c","Type":"ContainerDied","Data":"cab5daac83f97dcf28a00c004b3c4d8bc3c45cbdad27f0f2623668abd1372dff"} Feb 16 21:59:08 crc kubenswrapper[4777]: I0216 21:59:08.998880 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.002164 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.081991 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.191458 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-svc\") pod \"17462730-ddcb-4b67-9d41-30ccec27e89c\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.191849 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpsqz\" (UniqueName: \"kubernetes.io/projected/17462730-ddcb-4b67-9d41-30ccec27e89c-kube-api-access-zpsqz\") pod \"17462730-ddcb-4b67-9d41-30ccec27e89c\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.191982 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-swift-storage-0\") pod \"17462730-ddcb-4b67-9d41-30ccec27e89c\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.192078 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-nb\") pod \"17462730-ddcb-4b67-9d41-30ccec27e89c\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.194744 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-config\") pod \"17462730-ddcb-4b67-9d41-30ccec27e89c\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") " Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.194883 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-sb\") pod \"17462730-ddcb-4b67-9d41-30ccec27e89c\" (UID: \"17462730-ddcb-4b67-9d41-30ccec27e89c\") "
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.218688 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17462730-ddcb-4b67-9d41-30ccec27e89c-kube-api-access-zpsqz" (OuterVolumeSpecName: "kube-api-access-zpsqz") pod "17462730-ddcb-4b67-9d41-30ccec27e89c" (UID: "17462730-ddcb-4b67-9d41-30ccec27e89c"). InnerVolumeSpecName "kube-api-access-zpsqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.277032 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "17462730-ddcb-4b67-9d41-30ccec27e89c" (UID: "17462730-ddcb-4b67-9d41-30ccec27e89c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.288720 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-config" (OuterVolumeSpecName: "config") pod "17462730-ddcb-4b67-9d41-30ccec27e89c" (UID: "17462730-ddcb-4b67-9d41-30ccec27e89c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.291166 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "17462730-ddcb-4b67-9d41-30ccec27e89c" (UID: "17462730-ddcb-4b67-9d41-30ccec27e89c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.294107 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "17462730-ddcb-4b67-9d41-30ccec27e89c" (UID: "17462730-ddcb-4b67-9d41-30ccec27e89c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.299897 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.300072 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpsqz\" (UniqueName: \"kubernetes.io/projected/17462730-ddcb-4b67-9d41-30ccec27e89c-kube-api-access-zpsqz\") on node \"crc\" DevicePath \"\""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.300173 4777 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.300249 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.300347 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-config\") on node \"crc\" DevicePath \"\""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.337429 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17462730-ddcb-4b67-9d41-30ccec27e89c" (UID: "17462730-ddcb-4b67-9d41-30ccec27e89c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.402036 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17462730-ddcb-4b67-9d41-30ccec27e89c-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.699544 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.762915 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-m28rz"]
Feb 16 21:59:09 crc kubenswrapper[4777]: I0216 21:59:09.978342 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.010985 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8858652-305e-4b14-bbed-39a09b77d30f","Type":"ContainerStarted","Data":"78c979207458f09fc38e1370ee3ffac5e18deaba45567a1d7e6573cb611015da"}
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.012514 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"630e92a1-ae10-4fa5-a006-fac7619ca69b","Type":"ContainerStarted","Data":"58287e0a8931a5f4864bc6f8c253b2dcde56392b6d35ba26552c21b4533acd11"}
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.014568 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ace3f8-ed39-4b0f-9d38-e80ec56f0874","Type":"ContainerStarted","Data":"7404aa1bea0ee2b6b3cda7dd60fd23a44420edc6f355ad067a5f9379ecef4d4b"}
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.016948 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b"
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.019892 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tcf6b" event={"ID":"17462730-ddcb-4b67-9d41-30ccec27e89c","Type":"ContainerDied","Data":"388dd02c221c32b0668b769329c58c148546e2816c6562f0064f447af839b1c7"}
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.020006 4777 scope.go:117] "RemoveContainer" containerID="cab5daac83f97dcf28a00c004b3c4d8bc3c45cbdad27f0f2623668abd1372dff"
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.026837 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" event={"ID":"d135dd81-22b2-4c1d-9d7b-bc07f225abbb","Type":"ContainerStarted","Data":"046eb717af186b7dfa3e186ef806d6f07ad5563fb2e1f29a5d5cf3797e3cac96"}
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.065151 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tcf6b"]
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.065572 4777 scope.go:117] "RemoveContainer" containerID="c83d566f60af0e46e4ab95fa288566cf666e3ddc94b152bb1a7645b5ecebf9bc"
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.074379 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tcf6b"]
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.204949 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17462730-ddcb-4b67-9d41-30ccec27e89c" path="/var/lib/kubelet/pods/17462730-ddcb-4b67-9d41-30ccec27e89c/volumes"
Feb 16 21:59:10 crc kubenswrapper[4777]: I0216 21:59:10.415698 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 16 21:59:11 crc kubenswrapper[4777]: I0216 21:59:11.042026 4777 generic.go:334] "Generic (PLEG): container finished" podID="d135dd81-22b2-4c1d-9d7b-bc07f225abbb" containerID="e6cbb7362d54d9d994bf5f123cffa5f8f931039cfea13a48c3a848a8cf0c7aec" exitCode=0
Feb 16 21:59:11 crc kubenswrapper[4777]: I0216 21:59:11.042283 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" event={"ID":"d135dd81-22b2-4c1d-9d7b-bc07f225abbb","Type":"ContainerDied","Data":"e6cbb7362d54d9d994bf5f123cffa5f8f931039cfea13a48c3a848a8cf0c7aec"}
Feb 16 21:59:11 crc kubenswrapper[4777]: E0216 21:59:11.183600 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 21:59:12 crc kubenswrapper[4777]: I0216 21:59:12.051968 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"630e92a1-ae10-4fa5-a006-fac7619ca69b","Type":"ContainerStarted","Data":"78881c3659493c7079f93fa1057574cdc44c1cdd9aae67eb0127eeb20fc1f778"}
Feb 16 21:59:12 crc kubenswrapper[4777]: I0216 21:59:12.260293 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-586895bbf8-5lcdq"
Feb 16 21:59:12 crc kubenswrapper[4777]: I0216 21:59:12.370574 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-586895bbf8-5lcdq"
Feb 16 21:59:13 crc kubenswrapper[4777]: I0216 21:59:13.078491 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ace3f8-ed39-4b0f-9d38-e80ec56f0874","Type":"ContainerStarted","Data":"424c490251c5b3357734ae2216cf462f87c1b460d26b3891b1b1761e8527925f"}
Feb 16 21:59:13 crc kubenswrapper[4777]: I0216 21:59:13.099817 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" event={"ID":"d135dd81-22b2-4c1d-9d7b-bc07f225abbb","Type":"ContainerStarted","Data":"6435afb5bc5b7a6643a652deb5a732e92ac2c0d0a5b48c9e0ab848d9688a7813"}
Feb 16 21:59:13 crc kubenswrapper[4777]: I0216 21:59:13.101010 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz"
Feb 16 21:59:13 crc kubenswrapper[4777]: I0216 21:59:13.107807 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8858652-305e-4b14-bbed-39a09b77d30f","Type":"ContainerStarted","Data":"9f7c3422cde3af1689c80a14a278107ca3de3842eb8aedde6982c631aed16a53"}
Feb 16 21:59:13 crc kubenswrapper[4777]: I0216 21:59:13.107965 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 16 21:59:13 crc kubenswrapper[4777]: I0216 21:59:13.111026 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"630e92a1-ae10-4fa5-a006-fac7619ca69b","Type":"ContainerStarted","Data":"ef4f14124dffb958ed3c2a29c02c7b07e2bbb0f1414bd61882ba057b313b2fef"}
Feb 16 21:59:13 crc kubenswrapper[4777]: I0216 21:59:13.111149 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="630e92a1-ae10-4fa5-a006-fac7619ca69b" containerName="cinder-api-log" containerID="cri-o://78881c3659493c7079f93fa1057574cdc44c1cdd9aae67eb0127eeb20fc1f778" gracePeriod=30
Feb 16 21:59:13 crc kubenswrapper[4777]: I0216 21:59:13.111238 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="630e92a1-ae10-4fa5-a006-fac7619ca69b" containerName="cinder-api" containerID="cri-o://ef4f14124dffb958ed3c2a29c02c7b07e2bbb0f1414bd61882ba057b313b2fef" gracePeriod=30
Feb 16 21:59:13 crc kubenswrapper[4777]: I0216 21:59:13.173068 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.210406142 podStartE2EDuration="7.173049691s" podCreationTimestamp="2026-02-16 21:59:06 +0000 UTC" firstStartedPulling="2026-02-16 21:59:07.209674535 +0000 UTC m=+1267.792175807" lastFinishedPulling="2026-02-16 21:59:12.172318234 +0000 UTC m=+1272.754819356" observedRunningTime="2026-02-16 21:59:13.165828799 +0000 UTC m=+1273.748329901" watchObservedRunningTime="2026-02-16 21:59:13.173049691 +0000 UTC m=+1273.755550783"
Feb 16 21:59:13 crc kubenswrapper[4777]: I0216 21:59:13.174620 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" podStartSLOduration=5.174612795 podStartE2EDuration="5.174612795s" podCreationTimestamp="2026-02-16 21:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:13.141553569 +0000 UTC m=+1273.724054671" watchObservedRunningTime="2026-02-16 21:59:13.174612795 +0000 UTC m=+1273.757113897"
Feb 16 21:59:13 crc kubenswrapper[4777]: I0216 21:59:13.999341 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.087493 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f8f666f74-96kzt"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.123304 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.123291215 podStartE2EDuration="6.123291215s" podCreationTimestamp="2026-02-16 21:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:13.188946017 +0000 UTC m=+1273.771447119" watchObservedRunningTime="2026-02-16 21:59:14.123291215 +0000 UTC m=+1274.705792317"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.164259 4777 generic.go:334] "Generic (PLEG): container finished" podID="630e92a1-ae10-4fa5-a006-fac7619ca69b" containerID="78881c3659493c7079f93fa1057574cdc44c1cdd9aae67eb0127eeb20fc1f778" exitCode=143
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.164333 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"630e92a1-ae10-4fa5-a006-fac7619ca69b","Type":"ContainerDied","Data":"78881c3659493c7079f93fa1057574cdc44c1cdd9aae67eb0127eeb20fc1f778"}
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.178951 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ace3f8-ed39-4b0f-9d38-e80ec56f0874","Type":"ContainerStarted","Data":"7d5423d6ea4f6db4862db96f31540776e22f749e0416cd3800f9b5c30f477978"}
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.232114 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.00325864 podStartE2EDuration="6.232097963s" podCreationTimestamp="2026-02-16 21:59:08 +0000 UTC" firstStartedPulling="2026-02-16 21:59:09.984647624 +0000 UTC m=+1270.567148726" lastFinishedPulling="2026-02-16 21:59:12.213486937 +0000 UTC m=+1272.795988049" observedRunningTime="2026-02-16 21:59:14.226728142 +0000 UTC m=+1274.809229244" watchObservedRunningTime="2026-02-16 21:59:14.232097963 +0000 UTC m=+1274.814599065"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.334673 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f86f96cb9-gdgkm"]
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.335150 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f86f96cb9-gdgkm" podUID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerName="neutron-api" containerID="cri-o://f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05" gracePeriod=30
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.337460 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f86f96cb9-gdgkm" podUID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerName="neutron-httpd" containerID="cri-o://1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a" gracePeriod=30
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.369964 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5dd8c5d5-n2w9s"]
Feb 16 21:59:14 crc kubenswrapper[4777]: E0216 21:59:14.370610 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17462730-ddcb-4b67-9d41-30ccec27e89c" containerName="dnsmasq-dns"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.370749 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="17462730-ddcb-4b67-9d41-30ccec27e89c" containerName="dnsmasq-dns"
Feb 16 21:59:14 crc kubenswrapper[4777]: E0216 21:59:14.371786 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17462730-ddcb-4b67-9d41-30ccec27e89c" containerName="init"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.371910 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="17462730-ddcb-4b67-9d41-30ccec27e89c" containerName="init"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.372236 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="17462730-ddcb-4b67-9d41-30ccec27e89c" containerName="dnsmasq-dns"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.374453 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.395290 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dd8c5d5-n2w9s"]
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.443601 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6f86f96cb9-gdgkm" podUID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.172:9696/\": read tcp 10.217.0.2:52970->10.217.0.172:9696: read: connection reset by peer"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.478899 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-internal-tls-certs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.478949 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-combined-ca-bundle\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.479001 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-public-tls-certs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.479139 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-config\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.479421 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-httpd-config\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.479515 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfwgs\" (UniqueName: \"kubernetes.io/projected/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-kube-api-access-vfwgs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.479700 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-ovndb-tls-certs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.581849 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-ovndb-tls-certs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.582200 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-internal-tls-certs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.582286 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-combined-ca-bundle\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.582374 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-public-tls-certs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.582704 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-config\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.582844 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-httpd-config\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.582931 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfwgs\" (UniqueName: \"kubernetes.io/projected/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-kube-api-access-vfwgs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.588784 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-httpd-config\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.588967 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-ovndb-tls-certs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.589464 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-combined-ca-bundle\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.590526 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-public-tls-certs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.593693 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-internal-tls-certs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.595357 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-config\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.598272 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfwgs\" (UniqueName: \"kubernetes.io/projected/2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8-kube-api-access-vfwgs\") pod \"neutron-5dd8c5d5-n2w9s\" (UID: \"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8\") " pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:14 crc kubenswrapper[4777]: I0216 21:59:14.704165 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:15 crc kubenswrapper[4777]: I0216 21:59:15.078150 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b898f6696-7cphs"
Feb 16 21:59:15 crc kubenswrapper[4777]: I0216 21:59:15.116106 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b898f6696-7cphs"
Feb 16 21:59:15 crc kubenswrapper[4777]: I0216 21:59:15.176595 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-586895bbf8-5lcdq"]
Feb 16 21:59:15 crc kubenswrapper[4777]: I0216 21:59:15.177164 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-586895bbf8-5lcdq" podUID="866497a0-a152-4937-b43e-45aec58b4fea" containerName="barbican-api-log" containerID="cri-o://3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192" gracePeriod=30
Feb 16 21:59:15 crc kubenswrapper[4777]: I0216 21:59:15.177805 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-586895bbf8-5lcdq" podUID="866497a0-a152-4937-b43e-45aec58b4fea" containerName="barbican-api" containerID="cri-o://7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede" gracePeriod=30
Feb 16 21:59:15 crc kubenswrapper[4777]: I0216 21:59:15.197246 4777 generic.go:334] "Generic (PLEG): container finished" podID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerID="1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a" exitCode=0
Feb 16 21:59:15 crc kubenswrapper[4777]: I0216 21:59:15.197442 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f86f96cb9-gdgkm" event={"ID":"1bb43a06-461f-46ca-b3ed-419ee64ea40f","Type":"ContainerDied","Data":"1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a"}
Feb 16 21:59:15 crc kubenswrapper[4777]: W0216 21:59:15.347910 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e1b22d3_1ea0_4f32_92be_3ab6b56a90b8.slice/crio-e76c7b5dcbf978098d3b09d7f612031fb6328fc8a2a88bb01ab14c95ebd0bb61 WatchSource:0}: Error finding container e76c7b5dcbf978098d3b09d7f612031fb6328fc8a2a88bb01ab14c95ebd0bb61: Status 404 returned error can't find the container with id e76c7b5dcbf978098d3b09d7f612031fb6328fc8a2a88bb01ab14c95ebd0bb61
Feb 16 21:59:15 crc kubenswrapper[4777]: I0216 21:59:15.348005 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dd8c5d5-n2w9s"]
Feb 16 21:59:16 crc kubenswrapper[4777]: I0216 21:59:16.215700 4777 generic.go:334] "Generic (PLEG): container finished" podID="866497a0-a152-4937-b43e-45aec58b4fea" containerID="3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192" exitCode=143
Feb 16 21:59:16 crc kubenswrapper[4777]: I0216 21:59:16.215745 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-586895bbf8-5lcdq" event={"ID":"866497a0-a152-4937-b43e-45aec58b4fea","Type":"ContainerDied","Data":"3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192"}
Feb 16 21:59:16 crc kubenswrapper[4777]: I0216 21:59:16.217381 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dd8c5d5-n2w9s" event={"ID":"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8","Type":"ContainerStarted","Data":"8329c8f54733adc6d7fb9f2b8b71ea9fbf6019ffadfd135baabfd5a1175c900d"}
Feb 16 21:59:16 crc kubenswrapper[4777]: I0216 21:59:16.217413 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dd8c5d5-n2w9s" event={"ID":"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8","Type":"ContainerStarted","Data":"ff9eb06ecf54f21617d30e0f40e239e66eff3e6671b248095bdb51bccbbb211b"}
Feb 16 21:59:16 crc kubenswrapper[4777]: I0216 21:59:16.217431 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dd8c5d5-n2w9s" event={"ID":"2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8","Type":"ContainerStarted","Data":"e76c7b5dcbf978098d3b09d7f612031fb6328fc8a2a88bb01ab14c95ebd0bb61"}
Feb 16 21:59:16 crc kubenswrapper[4777]: I0216 21:59:16.217513 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5dd8c5d5-n2w9s"
Feb 16 21:59:16 crc kubenswrapper[4777]: I0216 21:59:16.240263 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5dd8c5d5-n2w9s" podStartSLOduration=2.2402429440000002 podStartE2EDuration="2.240242944s" podCreationTimestamp="2026-02-16 21:59:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:16.232150488 +0000 UTC m=+1276.814651610" watchObservedRunningTime="2026-02-16 21:59:16.240242944 +0000 UTC m=+1276.822744056"
Feb 16 21:59:16 crc kubenswrapper[4777]: I0216 21:59:16.275977 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6f86f96cb9-gdgkm" podUID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.172:9696/\": dial tcp 10.217.0.172:9696: connect: connection refused"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.067211 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f8865555d-rstbr"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.068045 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f8865555d-rstbr"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.315431 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-57449bfc86-nk7r7"]
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.321269 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.339660 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57449bfc86-nk7r7"]
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.363092 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-586895bbf8-5lcdq" podUID="866497a0-a152-4937-b43e-45aec58b4fea" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": read tcp 10.217.0.2:57674->10.217.0.178:9311: read: connection reset by peer"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.363141 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-586895bbf8-5lcdq" podUID="866497a0-a152-4937-b43e-45aec58b4fea" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": read tcp 10.217.0.2:57676->10.217.0.178:9311: read: connection reset by peer"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.392200 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-public-tls-certs\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.392288 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf8f6\" (UniqueName: \"kubernetes.io/projected/73dc4ce5-0c6e-492a-84b5-097a3defc481-kube-api-access-sf8f6\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.392332 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-internal-tls-certs\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.392388 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73dc4ce5-0c6e-492a-84b5-097a3defc481-logs\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.392417 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-config-data\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.392825 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-scripts\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.392896 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-combined-ca-bundle\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.495033 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-combined-ca-bundle\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.495353 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-public-tls-certs\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.495396 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf8f6\" (UniqueName: \"kubernetes.io/projected/73dc4ce5-0c6e-492a-84b5-097a3defc481-kube-api-access-sf8f6\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.495434 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-internal-tls-certs\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.495481 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73dc4ce5-0c6e-492a-84b5-097a3defc481-logs\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.495504 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-config-data\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.495532 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-scripts\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.496411 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73dc4ce5-0c6e-492a-84b5-097a3defc481-logs\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.501028 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-scripts\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.503318 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-public-tls-certs\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.504399 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-combined-ca-bundle\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.504458 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-config-data\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.516059 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf8f6\" (UniqueName: \"kubernetes.io/projected/73dc4ce5-0c6e-492a-84b5-097a3defc481-kube-api-access-sf8f6\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.521548 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73dc4ce5-0c6e-492a-84b5-097a3defc481-internal-tls-certs\") pod \"placement-57449bfc86-nk7r7\" (UID: \"73dc4ce5-0c6e-492a-84b5-097a3defc481\") " pod="openstack/placement-57449bfc86-nk7r7"
Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.655525 4777 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-57449bfc86-nk7r7" Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.816507 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.902691 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8glj\" (UniqueName: \"kubernetes.io/projected/866497a0-a152-4937-b43e-45aec58b4fea-kube-api-access-t8glj\") pod \"866497a0-a152-4937-b43e-45aec58b4fea\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.902770 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/866497a0-a152-4937-b43e-45aec58b4fea-logs\") pod \"866497a0-a152-4937-b43e-45aec58b4fea\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.902822 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data\") pod \"866497a0-a152-4937-b43e-45aec58b4fea\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.903518 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/866497a0-a152-4937-b43e-45aec58b4fea-logs" (OuterVolumeSpecName: "logs") pod "866497a0-a152-4937-b43e-45aec58b4fea" (UID: "866497a0-a152-4937-b43e-45aec58b4fea"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.903899 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data-custom\") pod \"866497a0-a152-4937-b43e-45aec58b4fea\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.903951 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-combined-ca-bundle\") pod \"866497a0-a152-4937-b43e-45aec58b4fea\" (UID: \"866497a0-a152-4937-b43e-45aec58b4fea\") " Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.904509 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/866497a0-a152-4937-b43e-45aec58b4fea-logs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.908489 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "866497a0-a152-4937-b43e-45aec58b4fea" (UID: "866497a0-a152-4937-b43e-45aec58b4fea"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.909152 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/866497a0-a152-4937-b43e-45aec58b4fea-kube-api-access-t8glj" (OuterVolumeSpecName: "kube-api-access-t8glj") pod "866497a0-a152-4937-b43e-45aec58b4fea" (UID: "866497a0-a152-4937-b43e-45aec58b4fea"). InnerVolumeSpecName "kube-api-access-t8glj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.936867 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "866497a0-a152-4937-b43e-45aec58b4fea" (UID: "866497a0-a152-4937-b43e-45aec58b4fea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.957274 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data" (OuterVolumeSpecName: "config-data") pod "866497a0-a152-4937-b43e-45aec58b4fea" (UID: "866497a0-a152-4937-b43e-45aec58b4fea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:18 crc kubenswrapper[4777]: I0216 21:59:18.968865 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.002899 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.005557 4777 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.005585 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.005594 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8glj\" (UniqueName: 
\"kubernetes.io/projected/866497a0-a152-4937-b43e-45aec58b4fea-kube-api-access-t8glj\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.005602 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/866497a0-a152-4937-b43e-45aec58b4fea-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.052844 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-z7mrk"] Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.053117 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" podUID="e59a46ad-39b7-4fdd-b942-aa753716c6a8" containerName="dnsmasq-dns" containerID="cri-o://3a2a013017b489637009a466c9634556c144ec7dc1e1cfd8df3a0a6e8d63c457" gracePeriod=10 Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.167272 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57449bfc86-nk7r7"] Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.277536 4777 generic.go:334] "Generic (PLEG): container finished" podID="e59a46ad-39b7-4fdd-b942-aa753716c6a8" containerID="3a2a013017b489637009a466c9634556c144ec7dc1e1cfd8df3a0a6e8d63c457" exitCode=0 Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.277598 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" event={"ID":"e59a46ad-39b7-4fdd-b942-aa753716c6a8","Type":"ContainerDied","Data":"3a2a013017b489637009a466c9634556c144ec7dc1e1cfd8df3a0a6e8d63c457"} Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.286472 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57449bfc86-nk7r7" event={"ID":"73dc4ce5-0c6e-492a-84b5-097a3defc481","Type":"ContainerStarted","Data":"28c901e6f87cc51c52ba701846ea51791423952942640942779ea2aa08278ec4"} Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 
21:59:19.288576 4777 generic.go:334] "Generic (PLEG): container finished" podID="866497a0-a152-4937-b43e-45aec58b4fea" containerID="7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede" exitCode=0 Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.288598 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-586895bbf8-5lcdq" event={"ID":"866497a0-a152-4937-b43e-45aec58b4fea","Type":"ContainerDied","Data":"7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede"} Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.288757 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-586895bbf8-5lcdq" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.288859 4777 scope.go:117] "RemoveContainer" containerID="7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.291052 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-586895bbf8-5lcdq" event={"ID":"866497a0-a152-4937-b43e-45aec58b4fea","Type":"ContainerDied","Data":"d4cd18bcc1bdbb77fbfd377acc88c191900e364eac6b328e3ad21b967e2675d0"} Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.314203 4777 scope.go:117] "RemoveContainer" containerID="3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.337188 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.353992 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-586895bbf8-5lcdq"] Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.376240 4777 scope.go:117] "RemoveContainer" containerID="7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede" Feb 16 21:59:19 crc kubenswrapper[4777]: E0216 21:59:19.382018 4777 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede\": container with ID starting with 7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede not found: ID does not exist" containerID="7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.382062 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede"} err="failed to get container status \"7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede\": rpc error: code = NotFound desc = could not find container \"7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede\": container with ID starting with 7f1a0e250b309bfb271a4ac4c4ebb348c411f0307e7f9578fea80828072b9ede not found: ID does not exist" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.382109 4777 scope.go:117] "RemoveContainer" containerID="3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192" Feb 16 21:59:19 crc kubenswrapper[4777]: E0216 21:59:19.387986 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192\": container with ID starting with 3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192 not found: ID does not exist" containerID="3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.388053 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192"} err="failed to get container status \"3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192\": rpc error: code = NotFound desc = could not find 
container \"3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192\": container with ID starting with 3b66a062655d6b93faa40752c9486f67170e2790bf7b616215bc90b582d4e192 not found: ID does not exist" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.425959 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-586895bbf8-5lcdq"] Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.441066 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.695038 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.733609 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-swift-storage-0\") pod \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.733668 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-nb\") pod \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.733838 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-config\") pod \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.733927 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-svc\") pod \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.733959 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlgcn\" (UniqueName: \"kubernetes.io/projected/e59a46ad-39b7-4fdd-b942-aa753716c6a8-kube-api-access-xlgcn\") pod \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.733997 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-sb\") pod \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\" (UID: \"e59a46ad-39b7-4fdd-b942-aa753716c6a8\") " Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.740463 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59a46ad-39b7-4fdd-b942-aa753716c6a8-kube-api-access-xlgcn" (OuterVolumeSpecName: "kube-api-access-xlgcn") pod "e59a46ad-39b7-4fdd-b942-aa753716c6a8" (UID: "e59a46ad-39b7-4fdd-b942-aa753716c6a8"). InnerVolumeSpecName "kube-api-access-xlgcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.798106 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e59a46ad-39b7-4fdd-b942-aa753716c6a8" (UID: "e59a46ad-39b7-4fdd-b942-aa753716c6a8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.799300 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e59a46ad-39b7-4fdd-b942-aa753716c6a8" (UID: "e59a46ad-39b7-4fdd-b942-aa753716c6a8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.800575 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-config" (OuterVolumeSpecName: "config") pod "e59a46ad-39b7-4fdd-b942-aa753716c6a8" (UID: "e59a46ad-39b7-4fdd-b942-aa753716c6a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.819824 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e59a46ad-39b7-4fdd-b942-aa753716c6a8" (UID: "e59a46ad-39b7-4fdd-b942-aa753716c6a8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.821243 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e59a46ad-39b7-4fdd-b942-aa753716c6a8" (UID: "e59a46ad-39b7-4fdd-b942-aa753716c6a8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.836065 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.836089 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.836099 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlgcn\" (UniqueName: \"kubernetes.io/projected/e59a46ad-39b7-4fdd-b942-aa753716c6a8-kube-api-access-xlgcn\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.836110 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.836119 4777 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:19 crc kubenswrapper[4777]: I0216 21:59:19.836127 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e59a46ad-39b7-4fdd-b942-aa753716c6a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.193785 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="866497a0-a152-4937-b43e-45aec58b4fea" path="/var/lib/kubelet/pods/866497a0-a152-4937-b43e-45aec58b4fea/volumes" Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.299139 4777 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-57449bfc86-nk7r7" event={"ID":"73dc4ce5-0c6e-492a-84b5-097a3defc481","Type":"ContainerStarted","Data":"abd34b65e2f202b9ef3e544e9416e400b93d5ba08b28f8ae88d178438e3c0099"} Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.299185 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57449bfc86-nk7r7" event={"ID":"73dc4ce5-0c6e-492a-84b5-097a3defc481","Type":"ContainerStarted","Data":"46bcccb3a1a50a4fc3d2ded845b3e2f2344c9758f651444dacf95f0d83d021c6"} Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.299244 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57449bfc86-nk7r7" Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.299273 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57449bfc86-nk7r7" Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.303733 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.303753 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-z7mrk" event={"ID":"e59a46ad-39b7-4fdd-b942-aa753716c6a8","Type":"ContainerDied","Data":"454b505091e6a44f60e1bce88a4764d3d9dab27440a6a9809a53816342ef4425"} Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.303817 4777 scope.go:117] "RemoveContainer" containerID="3a2a013017b489637009a466c9634556c144ec7dc1e1cfd8df3a0a6e8d63c457" Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.304026 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="60ace3f8-ed39-4b0f-9d38-e80ec56f0874" containerName="cinder-scheduler" containerID="cri-o://424c490251c5b3357734ae2216cf462f87c1b460d26b3891b1b1761e8527925f" gracePeriod=30 Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.304044 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="60ace3f8-ed39-4b0f-9d38-e80ec56f0874" containerName="probe" containerID="cri-o://7d5423d6ea4f6db4862db96f31540776e22f749e0416cd3800f9b5c30f477978" gracePeriod=30 Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.331654 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-57449bfc86-nk7r7" podStartSLOduration=2.331636703 podStartE2EDuration="2.331636703s" podCreationTimestamp="2026-02-16 21:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:20.326859699 +0000 UTC m=+1280.909360801" watchObservedRunningTime="2026-02-16 21:59:20.331636703 +0000 UTC m=+1280.914137795" Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.346766 4777 scope.go:117] "RemoveContainer" 
containerID="050e8a4251be5be6fb141884a7f5c742176942bffeff1b64fa84b6483b125412" Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.369136 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-z7mrk"] Feb 16 21:59:20 crc kubenswrapper[4777]: I0216 21:59:20.384488 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-z7mrk"] Feb 16 21:59:21 crc kubenswrapper[4777]: I0216 21:59:21.147106 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 16 21:59:21 crc kubenswrapper[4777]: I0216 21:59:21.331846 4777 generic.go:334] "Generic (PLEG): container finished" podID="60ace3f8-ed39-4b0f-9d38-e80ec56f0874" containerID="7d5423d6ea4f6db4862db96f31540776e22f749e0416cd3800f9b5c30f477978" exitCode=0 Feb 16 21:59:21 crc kubenswrapper[4777]: I0216 21:59:21.333530 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ace3f8-ed39-4b0f-9d38-e80ec56f0874","Type":"ContainerDied","Data":"7d5423d6ea4f6db4862db96f31540776e22f749e0416cd3800f9b5c30f477978"} Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.011334 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.084563 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-config\") pod \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.084654 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-combined-ca-bundle\") pod \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.084828 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-internal-tls-certs\") pod \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.084875 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-httpd-config\") pod \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.084901 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb9vk\" (UniqueName: \"kubernetes.io/projected/1bb43a06-461f-46ca-b3ed-419ee64ea40f-kube-api-access-hb9vk\") pod \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.084920 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-ovndb-tls-certs\") pod \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.084984 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-public-tls-certs\") pod \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\" (UID: \"1bb43a06-461f-46ca-b3ed-419ee64ea40f\") " Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.091023 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1bb43a06-461f-46ca-b3ed-419ee64ea40f" (UID: "1bb43a06-461f-46ca-b3ed-419ee64ea40f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.098223 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb43a06-461f-46ca-b3ed-419ee64ea40f-kube-api-access-hb9vk" (OuterVolumeSpecName: "kube-api-access-hb9vk") pod "1bb43a06-461f-46ca-b3ed-419ee64ea40f" (UID: "1bb43a06-461f-46ca-b3ed-419ee64ea40f"). InnerVolumeSpecName "kube-api-access-hb9vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.158058 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bb43a06-461f-46ca-b3ed-419ee64ea40f" (UID: "1bb43a06-461f-46ca-b3ed-419ee64ea40f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.161395 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1bb43a06-461f-46ca-b3ed-419ee64ea40f" (UID: "1bb43a06-461f-46ca-b3ed-419ee64ea40f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.171849 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-config" (OuterVolumeSpecName: "config") pod "1bb43a06-461f-46ca-b3ed-419ee64ea40f" (UID: "1bb43a06-461f-46ca-b3ed-419ee64ea40f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.172079 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1bb43a06-461f-46ca-b3ed-419ee64ea40f" (UID: "1bb43a06-461f-46ca-b3ed-419ee64ea40f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.186387 4777 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.186432 4777 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.186444 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb9vk\" (UniqueName: \"kubernetes.io/projected/1bb43a06-461f-46ca-b3ed-419ee64ea40f-kube-api-access-hb9vk\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.186459 4777 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.186472 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.186485 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.195853 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e59a46ad-39b7-4fdd-b942-aa753716c6a8" path="/var/lib/kubelet/pods/e59a46ad-39b7-4fdd-b942-aa753716c6a8/volumes" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.207178 4777 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1bb43a06-461f-46ca-b3ed-419ee64ea40f" (UID: "1bb43a06-461f-46ca-b3ed-419ee64ea40f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.288705 4777 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bb43a06-461f-46ca-b3ed-419ee64ea40f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.356505 4777 generic.go:334] "Generic (PLEG): container finished" podID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerID="f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05" exitCode=0 Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.356550 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f86f96cb9-gdgkm" event={"ID":"1bb43a06-461f-46ca-b3ed-419ee64ea40f","Type":"ContainerDied","Data":"f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05"} Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.356581 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f86f96cb9-gdgkm" event={"ID":"1bb43a06-461f-46ca-b3ed-419ee64ea40f","Type":"ContainerDied","Data":"0c26b31254dd8e5a9aae5b289e3dadc106471bb63878ed7e39e61f06a8203bfb"} Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.356600 4777 scope.go:117] "RemoveContainer" containerID="1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.356596 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f86f96cb9-gdgkm" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.390012 4777 scope.go:117] "RemoveContainer" containerID="f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.413580 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f86f96cb9-gdgkm"] Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.422416 4777 scope.go:117] "RemoveContainer" containerID="1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a" Feb 16 21:59:22 crc kubenswrapper[4777]: E0216 21:59:22.423007 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a\": container with ID starting with 1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a not found: ID does not exist" containerID="1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.423091 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a"} err="failed to get container status \"1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a\": rpc error: code = NotFound desc = could not find container \"1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a\": container with ID starting with 1cffa35c78714c13149d504d39e4ea6a97356096a8dc7dd5bcdf02424d5cf35a not found: ID does not exist" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.423136 4777 scope.go:117] "RemoveContainer" containerID="f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05" Feb 16 21:59:22 crc kubenswrapper[4777]: E0216 21:59:22.424068 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05\": container with ID starting with f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05 not found: ID does not exist" containerID="f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.424133 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05"} err="failed to get container status \"f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05\": rpc error: code = NotFound desc = could not find container \"f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05\": container with ID starting with f0a5f98781e8741c23e6923bf9e1791b063f3ece8c70fe53a9ca53dc80741f05 not found: ID does not exist" Feb 16 21:59:22 crc kubenswrapper[4777]: I0216 21:59:22.424371 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f86f96cb9-gdgkm"] Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.202658 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" path="/var/lib/kubelet/pods/1bb43a06-461f-46ca-b3ed-419ee64ea40f/volumes" Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.379871 4777 generic.go:334] "Generic (PLEG): container finished" podID="60ace3f8-ed39-4b0f-9d38-e80ec56f0874" containerID="424c490251c5b3357734ae2216cf462f87c1b460d26b3891b1b1761e8527925f" exitCode=0 Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.380020 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ace3f8-ed39-4b0f-9d38-e80ec56f0874","Type":"ContainerDied","Data":"424c490251c5b3357734ae2216cf462f87c1b460d26b3891b1b1761e8527925f"} Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.476562 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/keystone-57454d8655-v6bgf" Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.871325 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.966162 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-scripts\") pod \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.966265 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-etc-machine-id\") pod \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.966307 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data\") pod \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.966353 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "60ace3f8-ed39-4b0f-9d38-e80ec56f0874" (UID: "60ace3f8-ed39-4b0f-9d38-e80ec56f0874"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.966368 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scnjp\" (UniqueName: \"kubernetes.io/projected/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-kube-api-access-scnjp\") pod \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.966555 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data-custom\") pod \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.966661 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-combined-ca-bundle\") pod \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\" (UID: \"60ace3f8-ed39-4b0f-9d38-e80ec56f0874\") " Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.967482 4777 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.971736 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-scripts" (OuterVolumeSpecName: "scripts") pod "60ace3f8-ed39-4b0f-9d38-e80ec56f0874" (UID: "60ace3f8-ed39-4b0f-9d38-e80ec56f0874"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.972296 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "60ace3f8-ed39-4b0f-9d38-e80ec56f0874" (UID: "60ace3f8-ed39-4b0f-9d38-e80ec56f0874"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:24 crc kubenswrapper[4777]: I0216 21:59:24.973404 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-kube-api-access-scnjp" (OuterVolumeSpecName: "kube-api-access-scnjp") pod "60ace3f8-ed39-4b0f-9d38-e80ec56f0874" (UID: "60ace3f8-ed39-4b0f-9d38-e80ec56f0874"). InnerVolumeSpecName "kube-api-access-scnjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.058707 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60ace3f8-ed39-4b0f-9d38-e80ec56f0874" (UID: "60ace3f8-ed39-4b0f-9d38-e80ec56f0874"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.069860 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.069921 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scnjp\" (UniqueName: \"kubernetes.io/projected/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-kube-api-access-scnjp\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.069936 4777 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.069948 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.112627 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data" (OuterVolumeSpecName: "config-data") pod "60ace3f8-ed39-4b0f-9d38-e80ec56f0874" (UID: "60ace3f8-ed39-4b0f-9d38-e80ec56f0874"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.171772 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ace3f8-ed39-4b0f-9d38-e80ec56f0874-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:25 crc kubenswrapper[4777]: E0216 21:59:25.183165 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.391066 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"60ace3f8-ed39-4b0f-9d38-e80ec56f0874","Type":"ContainerDied","Data":"7404aa1bea0ee2b6b3cda7dd60fd23a44420edc6f355ad067a5f9379ecef4d4b"} Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.391146 4777 scope.go:117] "RemoveContainer" containerID="7d5423d6ea4f6db4862db96f31540776e22f749e0416cd3800f9b5c30f477978" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.391115 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.433867 4777 scope.go:117] "RemoveContainer" containerID="424c490251c5b3357734ae2216cf462f87c1b460d26b3891b1b1761e8527925f" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.444690 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.458388 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.470365 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 16 21:59:25 crc kubenswrapper[4777]: E0216 21:59:25.471553 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerName="neutron-api" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.471576 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerName="neutron-api" Feb 16 21:59:25 crc kubenswrapper[4777]: E0216 21:59:25.471595 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866497a0-a152-4937-b43e-45aec58b4fea" containerName="barbican-api" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.471604 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="866497a0-a152-4937-b43e-45aec58b4fea" containerName="barbican-api" Feb 16 21:59:25 crc kubenswrapper[4777]: E0216 21:59:25.471632 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59a46ad-39b7-4fdd-b942-aa753716c6a8" containerName="dnsmasq-dns" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.471640 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59a46ad-39b7-4fdd-b942-aa753716c6a8" containerName="dnsmasq-dns" Feb 16 21:59:25 crc kubenswrapper[4777]: E0216 21:59:25.471654 4777 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e59a46ad-39b7-4fdd-b942-aa753716c6a8" containerName="init" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.471662 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59a46ad-39b7-4fdd-b942-aa753716c6a8" containerName="init" Feb 16 21:59:25 crc kubenswrapper[4777]: E0216 21:59:25.471681 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ace3f8-ed39-4b0f-9d38-e80ec56f0874" containerName="probe" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.471689 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ace3f8-ed39-4b0f-9d38-e80ec56f0874" containerName="probe" Feb 16 21:59:25 crc kubenswrapper[4777]: E0216 21:59:25.471698 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ace3f8-ed39-4b0f-9d38-e80ec56f0874" containerName="cinder-scheduler" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.471706 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ace3f8-ed39-4b0f-9d38-e80ec56f0874" containerName="cinder-scheduler" Feb 16 21:59:25 crc kubenswrapper[4777]: E0216 21:59:25.471736 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerName="neutron-httpd" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.471745 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerName="neutron-httpd" Feb 16 21:59:25 crc kubenswrapper[4777]: E0216 21:59:25.471765 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866497a0-a152-4937-b43e-45aec58b4fea" containerName="barbican-api-log" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.471773 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="866497a0-a152-4937-b43e-45aec58b4fea" containerName="barbican-api-log" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.472009 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="866497a0-a152-4937-b43e-45aec58b4fea" 
containerName="barbican-api-log" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.472027 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="866497a0-a152-4937-b43e-45aec58b4fea" containerName="barbican-api" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.472046 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ace3f8-ed39-4b0f-9d38-e80ec56f0874" containerName="probe" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.472063 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerName="neutron-api" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.472083 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ace3f8-ed39-4b0f-9d38-e80ec56f0874" containerName="cinder-scheduler" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.472093 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb43a06-461f-46ca-b3ed-419ee64ea40f" containerName="neutron-httpd" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.472107 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59a46ad-39b7-4fdd-b942-aa753716c6a8" containerName="dnsmasq-dns" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.473441 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.481889 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.486649 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.579063 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.579417 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.579534 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-config-data\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.579622 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42ee3374-a9ec-4419-a498-aee6852b4ade-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 
21:59:25.579851 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-scripts\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.580040 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpm45\" (UniqueName: \"kubernetes.io/projected/42ee3374-a9ec-4419-a498-aee6852b4ade-kube-api-access-rpm45\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.682266 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-scripts\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.682881 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpm45\" (UniqueName: \"kubernetes.io/projected/42ee3374-a9ec-4419-a498-aee6852b4ade-kube-api-access-rpm45\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.683114 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.683197 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.683288 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-config-data\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.683355 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42ee3374-a9ec-4419-a498-aee6852b4ade-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.683548 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42ee3374-a9ec-4419-a498-aee6852b4ade-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.686949 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-scripts\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.688412 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " 
pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.689011 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.699829 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ee3374-a9ec-4419-a498-aee6852b4ade-config-data\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.711782 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpm45\" (UniqueName: \"kubernetes.io/projected/42ee3374-a9ec-4419-a498-aee6852b4ade-kube-api-access-rpm45\") pod \"cinder-scheduler-0\" (UID: \"42ee3374-a9ec-4419-a498-aee6852b4ade\") " pod="openstack/cinder-scheduler-0" Feb 16 21:59:25 crc kubenswrapper[4777]: I0216 21:59:25.833183 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 16 21:59:26 crc kubenswrapper[4777]: I0216 21:59:26.199381 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ace3f8-ed39-4b0f-9d38-e80ec56f0874" path="/var/lib/kubelet/pods/60ace3f8-ed39-4b0f-9d38-e80ec56f0874/volumes" Feb 16 21:59:26 crc kubenswrapper[4777]: W0216 21:59:26.330647 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42ee3374_a9ec_4419_a498_aee6852b4ade.slice/crio-999db0c1b5038ebdab57e455cd6cf7c83b2ed43bb7c2bf9198d4767c29969121 WatchSource:0}: Error finding container 999db0c1b5038ebdab57e455cd6cf7c83b2ed43bb7c2bf9198d4767c29969121: Status 404 returned error can't find the container with id 999db0c1b5038ebdab57e455cd6cf7c83b2ed43bb7c2bf9198d4767c29969121 Feb 16 21:59:26 crc kubenswrapper[4777]: I0216 21:59:26.331630 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 16 21:59:26 crc kubenswrapper[4777]: I0216 21:59:26.402368 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"42ee3374-a9ec-4419-a498-aee6852b4ade","Type":"ContainerStarted","Data":"999db0c1b5038ebdab57e455cd6cf7c83b2ed43bb7c2bf9198d4767c29969121"} Feb 16 21:59:27 crc kubenswrapper[4777]: I0216 21:59:27.413601 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"42ee3374-a9ec-4419-a498-aee6852b4ade","Type":"ContainerStarted","Data":"a6b2d1800e0f10629c74964daf1815033b93aa01c6bb3ac93f51d4dbd79b4c98"} Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.102656 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.105376 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.107506 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.108986 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.109198 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-k26xv" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.113299 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.164382 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgsj6\" (UniqueName: \"kubernetes.io/projected/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-kube-api-access-kgsj6\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.164518 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.164615 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-openstack-config\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.164642 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-openstack-config-secret\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.267211 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-openstack-config\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.268623 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-openstack-config\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.268708 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-openstack-config-secret\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.269066 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgsj6\" (UniqueName: \"kubernetes.io/projected/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-kube-api-access-kgsj6\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.269132 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.275052 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-openstack-config-secret\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.275698 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.285834 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgsj6\" (UniqueName: \"kubernetes.io/projected/5ae5d18f-9104-4d22-ab01-f97681e4bbc8-kube-api-access-kgsj6\") pod \"openstackclient\" (UID: \"5ae5d18f-9104-4d22-ab01-f97681e4bbc8\") " pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.424768 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"42ee3374-a9ec-4419-a498-aee6852b4ade","Type":"ContainerStarted","Data":"f3385202d7f03789079264bfc1ea586aa895e255594517af80704842931e07d5"} Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.449229 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.453004 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.452980908 podStartE2EDuration="3.452980908s" podCreationTimestamp="2026-02-16 21:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:28.446476606 +0000 UTC m=+1289.028977728" watchObservedRunningTime="2026-02-16 21:59:28.452980908 +0000 UTC m=+1289.035482010" Feb 16 21:59:28 crc kubenswrapper[4777]: I0216 21:59:28.918393 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 16 21:59:29 crc kubenswrapper[4777]: I0216 21:59:29.485845 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5ae5d18f-9104-4d22-ab01-f97681e4bbc8","Type":"ContainerStarted","Data":"2eae7628646389fcd8b71b4fefd4ac28628bc8aa3f6f7add6069aca8698b238c"} Feb 16 21:59:30 crc kubenswrapper[4777]: I0216 21:59:30.833742 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.129925 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-69c8b968f9-qldk7"] Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.131917 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.135228 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.135358 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.135884 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.155044 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-69c8b968f9-qldk7"] Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.166658 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9159fa9f-578f-4292-bf32-8acbecf95c58-etc-swift\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.166724 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-internal-tls-certs\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.166759 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-config-data\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 
21:59:32.166829 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-combined-ca-bundle\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.166853 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9159fa9f-578f-4292-bf32-8acbecf95c58-log-httpd\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.166873 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hhlp\" (UniqueName: \"kubernetes.io/projected/9159fa9f-578f-4292-bf32-8acbecf95c58-kube-api-access-7hhlp\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.166906 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-public-tls-certs\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.166923 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9159fa9f-578f-4292-bf32-8acbecf95c58-run-httpd\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 
21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.279640 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-combined-ca-bundle\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.279833 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9159fa9f-578f-4292-bf32-8acbecf95c58-log-httpd\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.280541 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hhlp\" (UniqueName: \"kubernetes.io/projected/9159fa9f-578f-4292-bf32-8acbecf95c58-kube-api-access-7hhlp\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.280536 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9159fa9f-578f-4292-bf32-8acbecf95c58-log-httpd\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.280645 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-public-tls-certs\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.280678 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9159fa9f-578f-4292-bf32-8acbecf95c58-run-httpd\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.280932 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9159fa9f-578f-4292-bf32-8acbecf95c58-etc-swift\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.280981 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-internal-tls-certs\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.281027 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-config-data\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.281202 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9159fa9f-578f-4292-bf32-8acbecf95c58-run-httpd\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.286691 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-combined-ca-bundle\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.286849 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-internal-tls-certs\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.287574 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-config-data\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.288218 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9159fa9f-578f-4292-bf32-8acbecf95c58-etc-swift\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.300615 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9159fa9f-578f-4292-bf32-8acbecf95c58-public-tls-certs\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.302395 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hhlp\" (UniqueName: 
\"kubernetes.io/projected/9159fa9f-578f-4292-bf32-8acbecf95c58-kube-api-access-7hhlp\") pod \"swift-proxy-69c8b968f9-qldk7\" (UID: \"9159fa9f-578f-4292-bf32-8acbecf95c58\") " pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:32 crc kubenswrapper[4777]: I0216 21:59:32.459911 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:33 crc kubenswrapper[4777]: I0216 21:59:33.076287 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-69c8b968f9-qldk7"] Feb 16 21:59:33 crc kubenswrapper[4777]: I0216 21:59:33.473102 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:33 crc kubenswrapper[4777]: I0216 21:59:33.473437 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="proxy-httpd" containerID="cri-o://9f7c3422cde3af1689c80a14a278107ca3de3842eb8aedde6982c631aed16a53" gracePeriod=30 Feb 16 21:59:33 crc kubenswrapper[4777]: I0216 21:59:33.473502 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="sg-core" containerID="cri-o://78c979207458f09fc38e1370ee3ffac5e18deaba45567a1d7e6573cb611015da" gracePeriod=30 Feb 16 21:59:33 crc kubenswrapper[4777]: I0216 21:59:33.473805 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="ceilometer-central-agent" containerID="cri-o://25de2aaa8c60033eabacde3caf311d985aeed02bed737d7ce787d66bd84b2410" gracePeriod=30 Feb 16 21:59:33 crc kubenswrapper[4777]: I0216 21:59:33.473868 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" 
containerName="ceilometer-notification-agent" containerID="cri-o://982413fbef394604ecd4fd9c043b24a9c8d514c433fa28f79f33b633c5012ffb" gracePeriod=30 Feb 16 21:59:33 crc kubenswrapper[4777]: I0216 21:59:33.481691 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 16 21:59:33 crc kubenswrapper[4777]: I0216 21:59:33.562979 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69c8b968f9-qldk7" event={"ID":"9159fa9f-578f-4292-bf32-8acbecf95c58","Type":"ContainerStarted","Data":"2f16a845247adf845ab36ffcb3e7b3be78625962940bd4c07f36d9e68efbdfc9"} Feb 16 21:59:34 crc kubenswrapper[4777]: I0216 21:59:34.576451 4777 generic.go:334] "Generic (PLEG): container finished" podID="e8858652-305e-4b14-bbed-39a09b77d30f" containerID="9f7c3422cde3af1689c80a14a278107ca3de3842eb8aedde6982c631aed16a53" exitCode=0 Feb 16 21:59:34 crc kubenswrapper[4777]: I0216 21:59:34.576861 4777 generic.go:334] "Generic (PLEG): container finished" podID="e8858652-305e-4b14-bbed-39a09b77d30f" containerID="78c979207458f09fc38e1370ee3ffac5e18deaba45567a1d7e6573cb611015da" exitCode=2 Feb 16 21:59:34 crc kubenswrapper[4777]: I0216 21:59:34.576563 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8858652-305e-4b14-bbed-39a09b77d30f","Type":"ContainerDied","Data":"9f7c3422cde3af1689c80a14a278107ca3de3842eb8aedde6982c631aed16a53"} Feb 16 21:59:34 crc kubenswrapper[4777]: I0216 21:59:34.576923 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8858652-305e-4b14-bbed-39a09b77d30f","Type":"ContainerDied","Data":"78c979207458f09fc38e1370ee3ffac5e18deaba45567a1d7e6573cb611015da"} Feb 16 21:59:34 crc kubenswrapper[4777]: I0216 21:59:34.576946 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e8858652-305e-4b14-bbed-39a09b77d30f","Type":"ContainerDied","Data":"25de2aaa8c60033eabacde3caf311d985aeed02bed737d7ce787d66bd84b2410"} Feb 16 21:59:34 crc kubenswrapper[4777]: I0216 21:59:34.576875 4777 generic.go:334] "Generic (PLEG): container finished" podID="e8858652-305e-4b14-bbed-39a09b77d30f" containerID="25de2aaa8c60033eabacde3caf311d985aeed02bed737d7ce787d66bd84b2410" exitCode=0 Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.070279 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.437805 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-tgbml"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.439279 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tgbml" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.464275 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tgbml"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.466641 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvhwj\" (UniqueName: \"kubernetes.io/projected/5a1218f6-7d46-4c08-be98-e27104f96caf-kube-api-access-vvhwj\") pod \"nova-api-db-create-tgbml\" (UID: \"5a1218f6-7d46-4c08-be98-e27104f96caf\") " pod="openstack/nova-api-db-create-tgbml" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.466697 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a1218f6-7d46-4c08-be98-e27104f96caf-operator-scripts\") pod \"nova-api-db-create-tgbml\" (UID: \"5a1218f6-7d46-4c08-be98-e27104f96caf\") " pod="openstack/nova-api-db-create-tgbml" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.535428 4777 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-1962-account-create-update-sdcsp"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.537073 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1962-account-create-update-sdcsp" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.541973 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.549084 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1962-account-create-update-sdcsp"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.568432 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-operator-scripts\") pod \"nova-api-1962-account-create-update-sdcsp\" (UID: \"95770934-2a37-4fc6-b3e4-5ffd1e429f4b\") " pod="openstack/nova-api-1962-account-create-update-sdcsp" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.568520 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxkgl\" (UniqueName: \"kubernetes.io/projected/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-kube-api-access-rxkgl\") pod \"nova-api-1962-account-create-update-sdcsp\" (UID: \"95770934-2a37-4fc6-b3e4-5ffd1e429f4b\") " pod="openstack/nova-api-1962-account-create-update-sdcsp" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.568576 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvhwj\" (UniqueName: \"kubernetes.io/projected/5a1218f6-7d46-4c08-be98-e27104f96caf-kube-api-access-vvhwj\") pod \"nova-api-db-create-tgbml\" (UID: \"5a1218f6-7d46-4c08-be98-e27104f96caf\") " pod="openstack/nova-api-db-create-tgbml" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.568595 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a1218f6-7d46-4c08-be98-e27104f96caf-operator-scripts\") pod \"nova-api-db-create-tgbml\" (UID: \"5a1218f6-7d46-4c08-be98-e27104f96caf\") " pod="openstack/nova-api-db-create-tgbml" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.569320 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a1218f6-7d46-4c08-be98-e27104f96caf-operator-scripts\") pod \"nova-api-db-create-tgbml\" (UID: \"5a1218f6-7d46-4c08-be98-e27104f96caf\") " pod="openstack/nova-api-db-create-tgbml" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.598689 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvhwj\" (UniqueName: \"kubernetes.io/projected/5a1218f6-7d46-4c08-be98-e27104f96caf-kube-api-access-vvhwj\") pod \"nova-api-db-create-tgbml\" (UID: \"5a1218f6-7d46-4c08-be98-e27104f96caf\") " pod="openstack/nova-api-db-create-tgbml" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.625396 4777 generic.go:334] "Generic (PLEG): container finished" podID="e8858652-305e-4b14-bbed-39a09b77d30f" containerID="982413fbef394604ecd4fd9c043b24a9c8d514c433fa28f79f33b633c5012ffb" exitCode=0 Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.625444 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8858652-305e-4b14-bbed-39a09b77d30f","Type":"ContainerDied","Data":"982413fbef394604ecd4fd9c043b24a9c8d514c433fa28f79f33b633c5012ffb"} Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.637300 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5dkfq"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.639032 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5dkfq" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.646110 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5dkfq"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.661505 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.180:3000/\": dial tcp 10.217.0.180:3000: connect: connection refused" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.671356 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8vhz\" (UniqueName: \"kubernetes.io/projected/e22e9b3b-7b09-4704-a1aa-b219e416360c-kube-api-access-q8vhz\") pod \"nova-cell0-db-create-5dkfq\" (UID: \"e22e9b3b-7b09-4704-a1aa-b219e416360c\") " pod="openstack/nova-cell0-db-create-5dkfq" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.671445 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxkgl\" (UniqueName: \"kubernetes.io/projected/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-kube-api-access-rxkgl\") pod \"nova-api-1962-account-create-update-sdcsp\" (UID: \"95770934-2a37-4fc6-b3e4-5ffd1e429f4b\") " pod="openstack/nova-api-1962-account-create-update-sdcsp" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.671520 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e22e9b3b-7b09-4704-a1aa-b219e416360c-operator-scripts\") pod \"nova-cell0-db-create-5dkfq\" (UID: \"e22e9b3b-7b09-4704-a1aa-b219e416360c\") " pod="openstack/nova-cell0-db-create-5dkfq" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.671591 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-operator-scripts\") pod \"nova-api-1962-account-create-update-sdcsp\" (UID: \"95770934-2a37-4fc6-b3e4-5ffd1e429f4b\") " pod="openstack/nova-api-1962-account-create-update-sdcsp" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.672825 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-operator-scripts\") pod \"nova-api-1962-account-create-update-sdcsp\" (UID: \"95770934-2a37-4fc6-b3e4-5ffd1e429f4b\") " pod="openstack/nova-api-1962-account-create-update-sdcsp" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.691356 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxkgl\" (UniqueName: \"kubernetes.io/projected/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-kube-api-access-rxkgl\") pod \"nova-api-1962-account-create-update-sdcsp\" (UID: \"95770934-2a37-4fc6-b3e4-5ffd1e429f4b\") " pod="openstack/nova-api-1962-account-create-update-sdcsp" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.735325 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-dmsfj"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.738139 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dmsfj" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.751094 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dmsfj"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.756932 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-tgbml" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.774601 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e22e9b3b-7b09-4704-a1aa-b219e416360c-operator-scripts\") pod \"nova-cell0-db-create-5dkfq\" (UID: \"e22e9b3b-7b09-4704-a1aa-b219e416360c\") " pod="openstack/nova-cell0-db-create-5dkfq" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.774739 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54c47b5e-f997-4751-9fe6-e35d904a7fea-operator-scripts\") pod \"nova-cell1-db-create-dmsfj\" (UID: \"54c47b5e-f997-4751-9fe6-e35d904a7fea\") " pod="openstack/nova-cell1-db-create-dmsfj" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.774788 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8vhz\" (UniqueName: \"kubernetes.io/projected/e22e9b3b-7b09-4704-a1aa-b219e416360c-kube-api-access-q8vhz\") pod \"nova-cell0-db-create-5dkfq\" (UID: \"e22e9b3b-7b09-4704-a1aa-b219e416360c\") " pod="openstack/nova-cell0-db-create-5dkfq" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.774873 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88rz\" (UniqueName: \"kubernetes.io/projected/54c47b5e-f997-4751-9fe6-e35d904a7fea-kube-api-access-n88rz\") pod \"nova-cell1-db-create-dmsfj\" (UID: \"54c47b5e-f997-4751-9fe6-e35d904a7fea\") " pod="openstack/nova-cell1-db-create-dmsfj" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.782432 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e22e9b3b-7b09-4704-a1aa-b219e416360c-operator-scripts\") pod \"nova-cell0-db-create-5dkfq\" (UID: 
\"e22e9b3b-7b09-4704-a1aa-b219e416360c\") " pod="openstack/nova-cell0-db-create-5dkfq" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.788791 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-dbc0-account-create-update-l6rwv"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.790024 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.791792 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.793965 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8vhz\" (UniqueName: \"kubernetes.io/projected/e22e9b3b-7b09-4704-a1aa-b219e416360c-kube-api-access-q8vhz\") pod \"nova-cell0-db-create-5dkfq\" (UID: \"e22e9b3b-7b09-4704-a1aa-b219e416360c\") " pod="openstack/nova-cell0-db-create-5dkfq" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.806223 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-dbc0-account-create-update-l6rwv"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.870364 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1962-account-create-update-sdcsp" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.877195 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54c47b5e-f997-4751-9fe6-e35d904a7fea-operator-scripts\") pod \"nova-cell1-db-create-dmsfj\" (UID: \"54c47b5e-f997-4751-9fe6-e35d904a7fea\") " pod="openstack/nova-cell1-db-create-dmsfj" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.877421 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df482342-2142-4778-9751-7f8c8daaccd8-operator-scripts\") pod \"nova-cell0-dbc0-account-create-update-l6rwv\" (UID: \"df482342-2142-4778-9751-7f8c8daaccd8\") " pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.877543 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n88rz\" (UniqueName: \"kubernetes.io/projected/54c47b5e-f997-4751-9fe6-e35d904a7fea-kube-api-access-n88rz\") pod \"nova-cell1-db-create-dmsfj\" (UID: \"54c47b5e-f997-4751-9fe6-e35d904a7fea\") " pod="openstack/nova-cell1-db-create-dmsfj" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.877655 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqlmd\" (UniqueName: \"kubernetes.io/projected/df482342-2142-4778-9751-7f8c8daaccd8-kube-api-access-lqlmd\") pod \"nova-cell0-dbc0-account-create-update-l6rwv\" (UID: \"df482342-2142-4778-9751-7f8c8daaccd8\") " pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.877903 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/54c47b5e-f997-4751-9fe6-e35d904a7fea-operator-scripts\") pod \"nova-cell1-db-create-dmsfj\" (UID: \"54c47b5e-f997-4751-9fe6-e35d904a7fea\") " pod="openstack/nova-cell1-db-create-dmsfj" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.893076 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88rz\" (UniqueName: \"kubernetes.io/projected/54c47b5e-f997-4751-9fe6-e35d904a7fea-kube-api-access-n88rz\") pod \"nova-cell1-db-create-dmsfj\" (UID: \"54c47b5e-f997-4751-9fe6-e35d904a7fea\") " pod="openstack/nova-cell1-db-create-dmsfj" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.940244 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f289-account-create-update-79rg7"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.942901 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f289-account-create-update-79rg7" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.947298 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.957315 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f289-account-create-update-79rg7"] Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.979089 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqlmd\" (UniqueName: \"kubernetes.io/projected/df482342-2142-4778-9751-7f8c8daaccd8-kube-api-access-lqlmd\") pod \"nova-cell0-dbc0-account-create-update-l6rwv\" (UID: \"df482342-2142-4778-9751-7f8c8daaccd8\") " pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.979179 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/df482342-2142-4778-9751-7f8c8daaccd8-operator-scripts\") pod \"nova-cell0-dbc0-account-create-update-l6rwv\" (UID: \"df482342-2142-4778-9751-7f8c8daaccd8\") " pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.979209 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-operator-scripts\") pod \"nova-cell1-f289-account-create-update-79rg7\" (UID: \"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc\") " pod="openstack/nova-cell1-f289-account-create-update-79rg7" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.980183 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df482342-2142-4778-9751-7f8c8daaccd8-operator-scripts\") pod \"nova-cell0-dbc0-account-create-update-l6rwv\" (UID: \"df482342-2142-4778-9751-7f8c8daaccd8\") " pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.983424 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr62f\" (UniqueName: \"kubernetes.io/projected/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-kube-api-access-fr62f\") pod \"nova-cell1-f289-account-create-update-79rg7\" (UID: \"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc\") " pod="openstack/nova-cell1-f289-account-create-update-79rg7" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.984009 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5dkfq" Feb 16 21:59:36 crc kubenswrapper[4777]: I0216 21:59:36.995098 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqlmd\" (UniqueName: \"kubernetes.io/projected/df482342-2142-4778-9751-7f8c8daaccd8-kube-api-access-lqlmd\") pod \"nova-cell0-dbc0-account-create-update-l6rwv\" (UID: \"df482342-2142-4778-9751-7f8c8daaccd8\") " pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" Feb 16 21:59:37 crc kubenswrapper[4777]: I0216 21:59:37.084011 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dmsfj" Feb 16 21:59:37 crc kubenswrapper[4777]: I0216 21:59:37.085587 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-operator-scripts\") pod \"nova-cell1-f289-account-create-update-79rg7\" (UID: \"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc\") " pod="openstack/nova-cell1-f289-account-create-update-79rg7" Feb 16 21:59:37 crc kubenswrapper[4777]: I0216 21:59:37.085746 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr62f\" (UniqueName: \"kubernetes.io/projected/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-kube-api-access-fr62f\") pod \"nova-cell1-f289-account-create-update-79rg7\" (UID: \"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc\") " pod="openstack/nova-cell1-f289-account-create-update-79rg7" Feb 16 21:59:37 crc kubenswrapper[4777]: I0216 21:59:37.086505 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-operator-scripts\") pod \"nova-cell1-f289-account-create-update-79rg7\" (UID: \"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc\") " pod="openstack/nova-cell1-f289-account-create-update-79rg7" Feb 16 21:59:37 crc kubenswrapper[4777]: I0216 
21:59:37.101949 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr62f\" (UniqueName: \"kubernetes.io/projected/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-kube-api-access-fr62f\") pod \"nova-cell1-f289-account-create-update-79rg7\" (UID: \"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc\") " pod="openstack/nova-cell1-f289-account-create-update-79rg7" Feb 16 21:59:37 crc kubenswrapper[4777]: I0216 21:59:37.153414 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" Feb 16 21:59:37 crc kubenswrapper[4777]: I0216 21:59:37.259437 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f289-account-create-update-79rg7" Feb 16 21:59:37 crc kubenswrapper[4777]: I0216 21:59:37.682422 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 16 21:59:37 crc kubenswrapper[4777]: I0216 21:59:37.682925 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="728c28d7-febe-4196-85fe-55e8278e8899" containerName="glance-log" containerID="cri-o://3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef" gracePeriod=30 Feb 16 21:59:37 crc kubenswrapper[4777]: I0216 21:59:37.682994 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="728c28d7-febe-4196-85fe-55e8278e8899" containerName="glance-httpd" containerID="cri-o://3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1" gracePeriod=30 Feb 16 21:59:38 crc kubenswrapper[4777]: I0216 21:59:38.458656 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 16 21:59:38 crc kubenswrapper[4777]: I0216 21:59:38.459174 4777 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="71a9dba4-89bb-4b87-a1cf-3f031befd745" containerName="glance-log" containerID="cri-o://333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858" gracePeriod=30 Feb 16 21:59:38 crc kubenswrapper[4777]: I0216 21:59:38.459257 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="71a9dba4-89bb-4b87-a1cf-3f031befd745" containerName="glance-httpd" containerID="cri-o://daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296" gracePeriod=30 Feb 16 21:59:38 crc kubenswrapper[4777]: I0216 21:59:38.674560 4777 generic.go:334] "Generic (PLEG): container finished" podID="728c28d7-febe-4196-85fe-55e8278e8899" containerID="3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef" exitCode=143 Feb 16 21:59:38 crc kubenswrapper[4777]: I0216 21:59:38.674607 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"728c28d7-febe-4196-85fe-55e8278e8899","Type":"ContainerDied","Data":"3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef"} Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.187647 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.187871 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="0eaee64e-a445-4d25-9781-d7067c0841f8" containerName="kube-state-metrics" containerID="cri-o://99cb9b349c1b4a155720a37143ddf67bf7fdbc6c6432d14310d058fdd1954bd8" gracePeriod=30 Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.717014 4777 generic.go:334] "Generic (PLEG): container finished" podID="71a9dba4-89bb-4b87-a1cf-3f031befd745" containerID="333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858" exitCode=143 Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.717276 4777 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71a9dba4-89bb-4b87-a1cf-3f031befd745","Type":"ContainerDied","Data":"333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858"} Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.736987 4777 generic.go:334] "Generic (PLEG): container finished" podID="0eaee64e-a445-4d25-9781-d7067c0841f8" containerID="99cb9b349c1b4a155720a37143ddf67bf7fdbc6c6432d14310d058fdd1954bd8" exitCode=2 Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.737025 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0eaee64e-a445-4d25-9781-d7067c0841f8","Type":"ContainerDied","Data":"99cb9b349c1b4a155720a37143ddf67bf7fdbc6c6432d14310d058fdd1954bd8"} Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.856163 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.948776 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-config-data\") pod \"e8858652-305e-4b14-bbed-39a09b77d30f\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.948923 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-run-httpd\") pod \"e8858652-305e-4b14-bbed-39a09b77d30f\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.948954 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scm8t\" (UniqueName: \"kubernetes.io/projected/e8858652-305e-4b14-bbed-39a09b77d30f-kube-api-access-scm8t\") pod \"e8858652-305e-4b14-bbed-39a09b77d30f\" (UID: 
\"e8858652-305e-4b14-bbed-39a09b77d30f\") " Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.949001 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-sg-core-conf-yaml\") pod \"e8858652-305e-4b14-bbed-39a09b77d30f\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.949020 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-combined-ca-bundle\") pod \"e8858652-305e-4b14-bbed-39a09b77d30f\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.949097 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-scripts\") pod \"e8858652-305e-4b14-bbed-39a09b77d30f\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.949186 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-log-httpd\") pod \"e8858652-305e-4b14-bbed-39a09b77d30f\" (UID: \"e8858652-305e-4b14-bbed-39a09b77d30f\") " Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.950194 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e8858652-305e-4b14-bbed-39a09b77d30f" (UID: "e8858652-305e-4b14-bbed-39a09b77d30f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.953282 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e8858652-305e-4b14-bbed-39a09b77d30f" (UID: "e8858652-305e-4b14-bbed-39a09b77d30f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.965428 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-scripts" (OuterVolumeSpecName: "scripts") pod "e8858652-305e-4b14-bbed-39a09b77d30f" (UID: "e8858652-305e-4b14-bbed-39a09b77d30f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:39 crc kubenswrapper[4777]: I0216 21:59:39.965631 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8858652-305e-4b14-bbed-39a09b77d30f-kube-api-access-scm8t" (OuterVolumeSpecName: "kube-api-access-scm8t") pod "e8858652-305e-4b14-bbed-39a09b77d30f" (UID: "e8858652-305e-4b14-bbed-39a09b77d30f"). InnerVolumeSpecName "kube-api-access-scm8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.059528 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.059557 4777 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.059565 4777 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8858652-305e-4b14-bbed-39a09b77d30f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.059574 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scm8t\" (UniqueName: \"kubernetes.io/projected/e8858652-305e-4b14-bbed-39a09b77d30f-kube-api-access-scm8t\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.163469 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e8858652-305e-4b14-bbed-39a09b77d30f" (UID: "e8858652-305e-4b14-bbed-39a09b77d30f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.168042 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 16 21:59:40 crc kubenswrapper[4777]: E0216 21:59:40.205197 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.235255 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8858652-305e-4b14-bbed-39a09b77d30f" (UID: "e8858652-305e-4b14-bbed-39a09b77d30f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.251985 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-config-data" (OuterVolumeSpecName: "config-data") pod "e8858652-305e-4b14-bbed-39a09b77d30f" (UID: "e8858652-305e-4b14-bbed-39a09b77d30f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.263167 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7697\" (UniqueName: \"kubernetes.io/projected/0eaee64e-a445-4d25-9781-d7067c0841f8-kube-api-access-d7697\") pod \"0eaee64e-a445-4d25-9781-d7067c0841f8\" (UID: \"0eaee64e-a445-4d25-9781-d7067c0841f8\") " Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.263909 4777 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.263932 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.263942 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8858652-305e-4b14-bbed-39a09b77d30f-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.266936 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eaee64e-a445-4d25-9781-d7067c0841f8-kube-api-access-d7697" (OuterVolumeSpecName: "kube-api-access-d7697") pod "0eaee64e-a445-4d25-9781-d7067c0841f8" (UID: "0eaee64e-a445-4d25-9781-d7067c0841f8"). InnerVolumeSpecName "kube-api-access-d7697". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.365919 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7697\" (UniqueName: \"kubernetes.io/projected/0eaee64e-a445-4d25-9781-d7067c0841f8-kube-api-access-d7697\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.486076 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-dbc0-account-create-update-l6rwv"] Feb 16 21:59:40 crc kubenswrapper[4777]: W0216 21:59:40.505942 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a1218f6_7d46_4c08_be98_e27104f96caf.slice/crio-6b1b34ae0a7f3ab7804be2fcffd51f460c26c908c1b1a6cb2bd03ceb99266f55 WatchSource:0}: Error finding container 6b1b34ae0a7f3ab7804be2fcffd51f460c26c908c1b1a6cb2bd03ceb99266f55: Status 404 returned error can't find the container with id 6b1b34ae0a7f3ab7804be2fcffd51f460c26c908c1b1a6cb2bd03ceb99266f55 Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.507370 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1962-account-create-update-sdcsp"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.530284 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5dkfq"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.563087 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dmsfj"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.593637 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tgbml"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.641598 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f289-account-create-update-79rg7"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.759337 4777 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5ae5d18f-9104-4d22-ab01-f97681e4bbc8","Type":"ContainerStarted","Data":"8a3bb1fd2b90945bbc8274d55692ff041fa8f78d98cccdeea3addf9631835a92"} Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.761401 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f289-account-create-update-79rg7" event={"ID":"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc","Type":"ContainerStarted","Data":"de88439188890274a1df4e54ae1a3d3eb517f74e7e26baef15c3cf8578ef56be"} Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.762640 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5dkfq" event={"ID":"e22e9b3b-7b09-4704-a1aa-b219e416360c","Type":"ContainerStarted","Data":"ef051d2d31bc5df1f6e70c0db6ced1a700d22dd6b14cacd4987e0c45cdf944bf"} Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.763852 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.764619 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0eaee64e-a445-4d25-9781-d7067c0841f8","Type":"ContainerDied","Data":"9c21113bf82f86335dda62a1e5deddf695b1ab52f0f55bc93caf86b942759cf0"} Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.764650 4777 scope.go:117] "RemoveContainer" containerID="99cb9b349c1b4a155720a37143ddf67bf7fdbc6c6432d14310d058fdd1954bd8" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.770691 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8858652-305e-4b14-bbed-39a09b77d30f","Type":"ContainerDied","Data":"873a356f3d9b329d5af27e348552b6566f7a8864aef613dfba89cacc48642fcd"} Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.770810 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.782210 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.231704115 podStartE2EDuration="12.782192333s" podCreationTimestamp="2026-02-16 21:59:28 +0000 UTC" firstStartedPulling="2026-02-16 21:59:28.932835948 +0000 UTC m=+1289.515337050" lastFinishedPulling="2026-02-16 21:59:39.483324166 +0000 UTC m=+1300.065825268" observedRunningTime="2026-02-16 21:59:40.781225126 +0000 UTC m=+1301.363726238" watchObservedRunningTime="2026-02-16 21:59:40.782192333 +0000 UTC m=+1301.364693435" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.783469 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" event={"ID":"df482342-2142-4778-9751-7f8c8daaccd8","Type":"ContainerStarted","Data":"d87aad1e3211ecfbd3c657c7af56f4624f98a9f866a42caff7f9121740598e8c"} Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.789597 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dmsfj" event={"ID":"54c47b5e-f997-4751-9fe6-e35d904a7fea","Type":"ContainerStarted","Data":"e50139bc34687a911910f899bf8d8d0c9ed61544f9bd3a6f38e44729b01c235a"} Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.791025 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1962-account-create-update-sdcsp" event={"ID":"95770934-2a37-4fc6-b3e4-5ffd1e429f4b","Type":"ContainerStarted","Data":"089b6bdbfd3a3887cf92f9d60515f7698fdad0013508b86286429ff860cdc55d"} Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.796384 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69c8b968f9-qldk7" event={"ID":"9159fa9f-578f-4292-bf32-8acbecf95c58","Type":"ContainerStarted","Data":"e7ac6ad72e76bf4f9760a7229fc1c0ae1b46e13ba3aa2baa9c8244a416da8fd8"} Feb 16 21:59:40 crc kubenswrapper[4777]: 
I0216 21:59:40.796418 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69c8b968f9-qldk7" event={"ID":"9159fa9f-578f-4292-bf32-8acbecf95c58","Type":"ContainerStarted","Data":"3205ff160b6dd1fa51b81c78041f25b79e26f512b17c9783c55498d2ec4e97e5"} Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.796828 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.796952 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.797751 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tgbml" event={"ID":"5a1218f6-7d46-4c08-be98-e27104f96caf","Type":"ContainerStarted","Data":"6b1b34ae0a7f3ab7804be2fcffd51f460c26c908c1b1a6cb2bd03ceb99266f55"} Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.821775 4777 scope.go:117] "RemoveContainer" containerID="9f7c3422cde3af1689c80a14a278107ca3de3842eb8aedde6982c631aed16a53" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.823829 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.844160 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.862702 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.871941 4777 scope.go:117] "RemoveContainer" containerID="78c979207458f09fc38e1370ee3ffac5e18deaba45567a1d7e6573cb611015da" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.874004 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 16 21:59:40 crc kubenswrapper[4777]: E0216 21:59:40.874482 4777 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="proxy-httpd" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.874503 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="proxy-httpd" Feb 16 21:59:40 crc kubenswrapper[4777]: E0216 21:59:40.874522 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="ceilometer-notification-agent" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.874529 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="ceilometer-notification-agent" Feb 16 21:59:40 crc kubenswrapper[4777]: E0216 21:59:40.874540 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="ceilometer-central-agent" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.874546 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="ceilometer-central-agent" Feb 16 21:59:40 crc kubenswrapper[4777]: E0216 21:59:40.874559 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eaee64e-a445-4d25-9781-d7067c0841f8" containerName="kube-state-metrics" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.874567 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eaee64e-a445-4d25-9781-d7067c0841f8" containerName="kube-state-metrics" Feb 16 21:59:40 crc kubenswrapper[4777]: E0216 21:59:40.874604 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="sg-core" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.874613 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="sg-core" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.874811 4777 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="ceilometer-central-agent" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.874832 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="sg-core" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.874843 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="ceilometer-notification-agent" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.874855 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eaee64e-a445-4d25-9781-d7067c0841f8" containerName="kube-state-metrics" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.874867 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" containerName="proxy-httpd" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.875677 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.878257 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.878520 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-m7pfc" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.878662 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.883312 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.893455 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.900145 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-69c8b968f9-qldk7" podStartSLOduration=8.900125826 podStartE2EDuration="8.900125826s" podCreationTimestamp="2026-02-16 21:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:40.836378521 +0000 UTC m=+1301.418879623" watchObservedRunningTime="2026-02-16 21:59:40.900125826 +0000 UTC m=+1301.482626928" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.908706 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.911266 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.915388 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.915626 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.918102 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:40 crc kubenswrapper[4777]: I0216 21:59:40.975523 4777 scope.go:117] "RemoveContainer" containerID="982413fbef394604ecd4fd9c043b24a9c8d514c433fa28f79f33b633c5012ffb" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.008351 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-run-httpd\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.008428 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.008462 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f49eb4b3-3728-4b56-ab7d-00460813ed7c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.008493 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.008520 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-942jp\" (UniqueName: \"kubernetes.io/projected/2efa4577-f0b4-4722-98eb-b438aaecd156-kube-api-access-942jp\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.008534 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-scripts\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.008561 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f49eb4b3-3728-4b56-ab7d-00460813ed7c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.008579 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8snf\" (UniqueName: \"kubernetes.io/projected/f49eb4b3-3728-4b56-ab7d-00460813ed7c-kube-api-access-c8snf\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.008598 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f49eb4b3-3728-4b56-ab7d-00460813ed7c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.008636 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-config-data\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.008657 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-log-httpd\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.051200 4777 scope.go:117] "RemoveContainer" containerID="25de2aaa8c60033eabacde3caf311d985aeed02bed737d7ce787d66bd84b2410" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.110580 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.110646 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f49eb4b3-3728-4b56-ab7d-00460813ed7c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc 
kubenswrapper[4777]: I0216 21:59:41.110698 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.110747 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-942jp\" (UniqueName: \"kubernetes.io/projected/2efa4577-f0b4-4722-98eb-b438aaecd156-kube-api-access-942jp\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.110766 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-scripts\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.110813 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f49eb4b3-3728-4b56-ab7d-00460813ed7c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.110834 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8snf\" (UniqueName: \"kubernetes.io/projected/f49eb4b3-3728-4b56-ab7d-00460813ed7c-kube-api-access-c8snf\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.110851 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f49eb4b3-3728-4b56-ab7d-00460813ed7c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.110909 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-config-data\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.110931 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-log-httpd\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.110992 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-run-httpd\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.111474 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-run-httpd\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.114009 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-log-httpd\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: 
I0216 21:59:41.117904 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.118156 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f49eb4b3-3728-4b56-ab7d-00460813ed7c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.120084 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-config-data\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.121375 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f49eb4b3-3728-4b56-ab7d-00460813ed7c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.121508 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.122287 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-scripts\") 
pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.126653 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f49eb4b3-3728-4b56-ab7d-00460813ed7c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.132511 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-942jp\" (UniqueName: \"kubernetes.io/projected/2efa4577-f0b4-4722-98eb-b438aaecd156-kube-api-access-942jp\") pod \"ceilometer-0\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.133547 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8snf\" (UniqueName: \"kubernetes.io/projected/f49eb4b3-3728-4b56-ab7d-00460813ed7c-kube-api-access-c8snf\") pod \"kube-state-metrics-0\" (UID: \"f49eb4b3-3728-4b56-ab7d-00460813ed7c\") " pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.325458 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.325731 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.660862 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.661195 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.743362 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.783062 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.863024 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"728c28d7-febe-4196-85fe-55e8278e8899\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.863206 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-httpd-run\") pod \"728c28d7-febe-4196-85fe-55e8278e8899\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.863363 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-scripts\") pod \"728c28d7-febe-4196-85fe-55e8278e8899\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.863468 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x99zs\" (UniqueName: \"kubernetes.io/projected/728c28d7-febe-4196-85fe-55e8278e8899-kube-api-access-x99zs\") pod \"728c28d7-febe-4196-85fe-55e8278e8899\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.863572 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-public-tls-certs\") pod \"728c28d7-febe-4196-85fe-55e8278e8899\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.863683 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-combined-ca-bundle\") pod \"728c28d7-febe-4196-85fe-55e8278e8899\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.863842 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-logs\") pod \"728c28d7-febe-4196-85fe-55e8278e8899\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.863980 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-config-data\") pod \"728c28d7-febe-4196-85fe-55e8278e8899\" (UID: \"728c28d7-febe-4196-85fe-55e8278e8899\") " Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.880808 4777 generic.go:334] "Generic (PLEG): container finished" podID="e22e9b3b-7b09-4704-a1aa-b219e416360c" containerID="bc31cbc847db042c93eb6192ca824981d5348d547d3db2fdbfbeb31c94b60cb3" exitCode=0 Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.880895 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5dkfq" event={"ID":"e22e9b3b-7b09-4704-a1aa-b219e416360c","Type":"ContainerDied","Data":"bc31cbc847db042c93eb6192ca824981d5348d547d3db2fdbfbeb31c94b60cb3"} Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.882083 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "728c28d7-febe-4196-85fe-55e8278e8899" (UID: "728c28d7-febe-4196-85fe-55e8278e8899"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:41 crc kubenswrapper[4777]: E0216 21:59:41.884283 4777 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54c47b5e_f997_4751_9fe6_e35d904a7fea.slice/crio-conmon-777a58515f8a44311bc51bc1a7a4dee89fc2bb699cd9f1aa52226d197b03d0dc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode22e9b3b_7b09_4704_a1aa_b219e416360c.slice/crio-conmon-bc31cbc847db042c93eb6192ca824981d5348d547d3db2fdbfbeb31c94b60cb3.scope\": RecentStats: unable to find data in memory cache]" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.890613 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-logs" (OuterVolumeSpecName: "logs") pod "728c28d7-febe-4196-85fe-55e8278e8899" (UID: "728c28d7-febe-4196-85fe-55e8278e8899"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.890803 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-scripts" (OuterVolumeSpecName: "scripts") pod "728c28d7-febe-4196-85fe-55e8278e8899" (UID: "728c28d7-febe-4196-85fe-55e8278e8899"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.907162 4777 generic.go:334] "Generic (PLEG): container finished" podID="df482342-2142-4778-9751-7f8c8daaccd8" containerID="1235d5d4c6e5382d1761fa3b85dc0a06c75a0964a0b89fb30d77cca1335797c6" exitCode=0 Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.907225 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" event={"ID":"df482342-2142-4778-9751-7f8c8daaccd8","Type":"ContainerDied","Data":"1235d5d4c6e5382d1761fa3b85dc0a06c75a0964a0b89fb30d77cca1335797c6"} Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.908686 4777 generic.go:334] "Generic (PLEG): container finished" podID="54c47b5e-f997-4751-9fe6-e35d904a7fea" containerID="777a58515f8a44311bc51bc1a7a4dee89fc2bb699cd9f1aa52226d197b03d0dc" exitCode=0 Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.908742 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dmsfj" event={"ID":"54c47b5e-f997-4751-9fe6-e35d904a7fea","Type":"ContainerDied","Data":"777a58515f8a44311bc51bc1a7a4dee89fc2bb699cd9f1aa52226d197b03d0dc"} Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.946751 4777 generic.go:334] "Generic (PLEG): container finished" podID="95770934-2a37-4fc6-b3e4-5ffd1e429f4b" containerID="bef5da1ad3c47056bea28d72ac12503974c64db3a6f5595d2a02caa0fa368e65" exitCode=0 Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.946909 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1962-account-create-update-sdcsp" event={"ID":"95770934-2a37-4fc6-b3e4-5ffd1e429f4b","Type":"ContainerDied","Data":"bef5da1ad3c47056bea28d72ac12503974c64db3a6f5595d2a02caa0fa368e65"} Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.967053 4777 generic.go:334] "Generic (PLEG): container finished" podID="728c28d7-febe-4196-85fe-55e8278e8899" 
containerID="3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1" exitCode=0 Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.967134 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"728c28d7-febe-4196-85fe-55e8278e8899","Type":"ContainerDied","Data":"3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1"} Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.967163 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"728c28d7-febe-4196-85fe-55e8278e8899","Type":"ContainerDied","Data":"bc1b5d56e9c3889157af4be45f9afef1946656464f94a369f0548730a23191c5"} Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.967181 4777 scope.go:117] "RemoveContainer" containerID="3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.967353 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 16 21:59:41 crc kubenswrapper[4777]: I0216 21:59:41.970337 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:41.999320 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-logs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:41.999626 4777 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/728c28d7-febe-4196-85fe-55e8278e8899-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:41.994290 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:41.982207 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728c28d7-febe-4196-85fe-55e8278e8899-kube-api-access-x99zs" (OuterVolumeSpecName: "kube-api-access-x99zs") pod "728c28d7-febe-4196-85fe-55e8278e8899" (UID: "728c28d7-febe-4196-85fe-55e8278e8899"). InnerVolumeSpecName "kube-api-access-x99zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:41.990407 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4" (OuterVolumeSpecName: "glance") pod "728c28d7-febe-4196-85fe-55e8278e8899" (UID: "728c28d7-febe-4196-85fe-55e8278e8899"). InnerVolumeSpecName "pvc-94c2b3e1-91fc-479b-9144-fb21290011a4". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.025840 4777 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.035295 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "728c28d7-febe-4196-85fe-55e8278e8899" (UID: "728c28d7-febe-4196-85fe-55e8278e8899"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.046776 4777 generic.go:334] "Generic (PLEG): container finished" podID="095f5bf0-5bb9-42b3-ae28-a7bef2475cfc" containerID="1ecb2e533d2619bf146c9f16fc7067bb6942f1d1dfb3d90bad41d0b0787da01a" exitCode=0 Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.046868 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f289-account-create-update-79rg7" event={"ID":"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc","Type":"ContainerDied","Data":"1ecb2e533d2619bf146c9f16fc7067bb6942f1d1dfb3d90bad41d0b0787da01a"} Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.051800 4777 generic.go:334] "Generic (PLEG): container finished" podID="5a1218f6-7d46-4c08-be98-e27104f96caf" containerID="2f01910c40c0fdb6027449b04396c1157e522b9ec02b91bbdea84270de3cebf6" exitCode=0 Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.052225 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-config-data" (OuterVolumeSpecName: "config-data") pod "728c28d7-febe-4196-85fe-55e8278e8899" (UID: "728c28d7-febe-4196-85fe-55e8278e8899"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.052306 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tgbml" event={"ID":"5a1218f6-7d46-4c08-be98-e27104f96caf","Type":"ContainerDied","Data":"2f01910c40c0fdb6027449b04396c1157e522b9ec02b91bbdea84270de3cebf6"} Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.092810 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "728c28d7-febe-4196-85fe-55e8278e8899" (UID: "728c28d7-febe-4196-85fe-55e8278e8899"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.102011 4777 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") on node \"crc\" " Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.102051 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x99zs\" (UniqueName: \"kubernetes.io/projected/728c28d7-febe-4196-85fe-55e8278e8899-kube-api-access-x99zs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.102067 4777 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.102080 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 
21:59:42.102092 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728c28d7-febe-4196-85fe-55e8278e8899-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.121891 4777 scope.go:117] "RemoveContainer" containerID="3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.122066 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.153520 4777 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.153663 4777 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-94c2b3e1-91fc-479b-9144-fb21290011a4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4") on node "crc" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.154228 4777 scope.go:117] "RemoveContainer" containerID="3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1" Feb 16 21:59:42 crc kubenswrapper[4777]: E0216 21:59:42.157076 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1\": container with ID starting with 3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1 not found: ID does not exist" containerID="3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.157117 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1"} err="failed to get container status 
\"3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1\": rpc error: code = NotFound desc = could not find container \"3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1\": container with ID starting with 3ff44608b4cee7264aead64dbaed408a95972dc3e107ce21df1f07ddcddb55b1 not found: ID does not exist" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.157693 4777 scope.go:117] "RemoveContainer" containerID="3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef" Feb 16 21:59:42 crc kubenswrapper[4777]: E0216 21:59:42.159631 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef\": container with ID starting with 3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef not found: ID does not exist" containerID="3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.159658 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef"} err="failed to get container status \"3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef\": rpc error: code = NotFound desc = could not find container \"3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef\": container with ID starting with 3eeb1b70a5cf231db92dad68e8785b956351f43ecbfeb6efd2e4d850f4d74cef not found: ID does not exist" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.193220 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eaee64e-a445-4d25-9781-d7067c0841f8" path="/var/lib/kubelet/pods/0eaee64e-a445-4d25-9781-d7067c0841f8/volumes" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.194429 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8858652-305e-4b14-bbed-39a09b77d30f" 
path="/var/lib/kubelet/pods/e8858652-305e-4b14-bbed-39a09b77d30f/volumes" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.204309 4777 reconciler_common.go:293] "Volume detached for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.300783 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.309405 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.330454 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 16 21:59:42 crc kubenswrapper[4777]: E0216 21:59:42.330816 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728c28d7-febe-4196-85fe-55e8278e8899" containerName="glance-log" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.330832 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="728c28d7-febe-4196-85fe-55e8278e8899" containerName="glance-log" Feb 16 21:59:42 crc kubenswrapper[4777]: E0216 21:59:42.330859 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728c28d7-febe-4196-85fe-55e8278e8899" containerName="glance-httpd" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.330866 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="728c28d7-febe-4196-85fe-55e8278e8899" containerName="glance-httpd" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.331033 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="728c28d7-febe-4196-85fe-55e8278e8899" containerName="glance-log" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.331051 4777 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="728c28d7-febe-4196-85fe-55e8278e8899" containerName="glance-httpd" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.350334 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.350435 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.353078 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.353259 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.513925 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.513980 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dc126ff-fbc5-4c23-8a19-084e71677f29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.514005 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " 
pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.514067 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-config-data\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.514200 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.514219 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7dtv\" (UniqueName: \"kubernetes.io/projected/3dc126ff-fbc5-4c23-8a19-084e71677f29-kube-api-access-g7dtv\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.514235 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-scripts\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.514278 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dc126ff-fbc5-4c23-8a19-084e71677f29-logs\") pod \"glance-default-external-api-0\" (UID: 
\"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.615421 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dc126ff-fbc5-4c23-8a19-084e71677f29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.615463 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.615515 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-config-data\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.615618 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.615634 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7dtv\" (UniqueName: \"kubernetes.io/projected/3dc126ff-fbc5-4c23-8a19-084e71677f29-kube-api-access-g7dtv\") pod \"glance-default-external-api-0\" (UID: 
\"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.615652 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-scripts\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.615676 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dc126ff-fbc5-4c23-8a19-084e71677f29-logs\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.615722 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.616105 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dc126ff-fbc5-4c23-8a19-084e71677f29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.616426 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dc126ff-fbc5-4c23-8a19-084e71677f29-logs\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc 
kubenswrapper[4777]: I0216 21:59:42.620247 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.620410 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.620466 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4a776c3602f9f8f9eee625c68304a3bb11e095b4c9719ede519a11464d274ec4/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.621856 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-config-data\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.627750 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.629270 4777 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc126ff-fbc5-4c23-8a19-084e71677f29-scripts\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.630289 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7dtv\" (UniqueName: \"kubernetes.io/projected/3dc126ff-fbc5-4c23-8a19-084e71677f29-kube-api-access-g7dtv\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.647205 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.712349 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-94c2b3e1-91fc-479b-9144-fb21290011a4\") pod \"glance-default-external-api-0\" (UID: \"3dc126ff-fbc5-4c23-8a19-084e71677f29\") " pod="openstack/glance-default-external-api-0" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.818420 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-internal-tls-certs\") pod \"71a9dba4-89bb-4b87-a1cf-3f031befd745\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.818600 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"71a9dba4-89bb-4b87-a1cf-3f031befd745\" (UID: 
\"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.819466 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-httpd-run\") pod \"71a9dba4-89bb-4b87-a1cf-3f031befd745\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.819519 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-scripts\") pod \"71a9dba4-89bb-4b87-a1cf-3f031befd745\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.819541 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8vpv\" (UniqueName: \"kubernetes.io/projected/71a9dba4-89bb-4b87-a1cf-3f031befd745-kube-api-access-g8vpv\") pod \"71a9dba4-89bb-4b87-a1cf-3f031befd745\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.819569 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-logs\") pod \"71a9dba4-89bb-4b87-a1cf-3f031befd745\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.819584 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-config-data\") pod \"71a9dba4-89bb-4b87-a1cf-3f031befd745\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.819640 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-combined-ca-bundle\") pod \"71a9dba4-89bb-4b87-a1cf-3f031befd745\" (UID: \"71a9dba4-89bb-4b87-a1cf-3f031befd745\") " Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.820591 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "71a9dba4-89bb-4b87-a1cf-3f031befd745" (UID: "71a9dba4-89bb-4b87-a1cf-3f031befd745"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.820663 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-logs" (OuterVolumeSpecName: "logs") pod "71a9dba4-89bb-4b87-a1cf-3f031befd745" (UID: "71a9dba4-89bb-4b87-a1cf-3f031befd745"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.824196 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-scripts" (OuterVolumeSpecName: "scripts") pod "71a9dba4-89bb-4b87-a1cf-3f031befd745" (UID: "71a9dba4-89bb-4b87-a1cf-3f031befd745"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.824298 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a9dba4-89bb-4b87-a1cf-3f031befd745-kube-api-access-g8vpv" (OuterVolumeSpecName: "kube-api-access-g8vpv") pod "71a9dba4-89bb-4b87-a1cf-3f031befd745" (UID: "71a9dba4-89bb-4b87-a1cf-3f031befd745"). InnerVolumeSpecName "kube-api-access-g8vpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.842026 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2" (OuterVolumeSpecName: "glance") pod "71a9dba4-89bb-4b87-a1cf-3f031befd745" (UID: "71a9dba4-89bb-4b87-a1cf-3f031befd745"). InnerVolumeSpecName "pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.847671 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71a9dba4-89bb-4b87-a1cf-3f031befd745" (UID: "71a9dba4-89bb-4b87-a1cf-3f031befd745"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.874721 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-config-data" (OuterVolumeSpecName: "config-data") pod "71a9dba4-89bb-4b87-a1cf-3f031befd745" (UID: "71a9dba4-89bb-4b87-a1cf-3f031befd745"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.878559 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "71a9dba4-89bb-4b87-a1cf-3f031befd745" (UID: "71a9dba4-89bb-4b87-a1cf-3f031befd745"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.923452 4777 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.923510 4777 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") on node \"crc\" " Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.923525 4777 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.923534 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.923544 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8vpv\" (UniqueName: \"kubernetes.io/projected/71a9dba4-89bb-4b87-a1cf-3f031befd745-kube-api-access-g8vpv\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.923553 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.923562 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a9dba4-89bb-4b87-a1cf-3f031befd745-logs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 
21:59:42.923570 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a9dba4-89bb-4b87-a1cf-3f031befd745-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.947890 4777 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 16 21:59:42 crc kubenswrapper[4777]: I0216 21:59:42.948049 4777 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2") on node "crc" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.008780 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.024930 4777 reconciler_common.go:293] "Volume detached for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.065531 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2efa4577-f0b4-4722-98eb-b438aaecd156","Type":"ContainerStarted","Data":"3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf"} Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.065927 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2efa4577-f0b4-4722-98eb-b438aaecd156","Type":"ContainerStarted","Data":"0844569ded7d98e5e635b441c04e684401983b79387d3bd1ee221ac5602ed855"} Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.067217 4777 generic.go:334] "Generic (PLEG): container finished" podID="71a9dba4-89bb-4b87-a1cf-3f031befd745" 
containerID="daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296" exitCode=0 Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.067330 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71a9dba4-89bb-4b87-a1cf-3f031befd745","Type":"ContainerDied","Data":"daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296"} Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.067441 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71a9dba4-89bb-4b87-a1cf-3f031befd745","Type":"ContainerDied","Data":"de505b4cdf02bb85410143424239e3b0d421ed90d5bb5f7d240993a4586e9214"} Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.067529 4777 scope.go:117] "RemoveContainer" containerID="daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.067694 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.076748 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f49eb4b3-3728-4b56-ab7d-00460813ed7c","Type":"ContainerStarted","Data":"a10c4616e90c908648bd2570b51b2efc30890bd73740ac38be8b938f2ad88095"} Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.076798 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f49eb4b3-3728-4b56-ab7d-00460813ed7c","Type":"ContainerStarted","Data":"f97a0a3248ef5c40ce04abc5437311e463d40bffebe049664edf1ad39a12d997"} Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.077504 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.109897 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.635573412 podStartE2EDuration="3.109882416s" podCreationTimestamp="2026-02-16 21:59:40 +0000 UTC" firstStartedPulling="2026-02-16 21:59:42.025581157 +0000 UTC m=+1302.608082259" lastFinishedPulling="2026-02-16 21:59:42.499890161 +0000 UTC m=+1303.082391263" observedRunningTime="2026-02-16 21:59:43.093690012 +0000 UTC m=+1303.676191114" watchObservedRunningTime="2026-02-16 21:59:43.109882416 +0000 UTC m=+1303.692383518" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.120790 4777 scope.go:117] "RemoveContainer" containerID="333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.159431 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.169045 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 16 21:59:43 
crc kubenswrapper[4777]: I0216 21:59:43.176648 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 16 21:59:43 crc kubenswrapper[4777]: E0216 21:59:43.177341 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a9dba4-89bb-4b87-a1cf-3f031befd745" containerName="glance-httpd" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.177389 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a9dba4-89bb-4b87-a1cf-3f031befd745" containerName="glance-httpd" Feb 16 21:59:43 crc kubenswrapper[4777]: E0216 21:59:43.177444 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a9dba4-89bb-4b87-a1cf-3f031befd745" containerName="glance-log" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.177451 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a9dba4-89bb-4b87-a1cf-3f031befd745" containerName="glance-log" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.177637 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a9dba4-89bb-4b87-a1cf-3f031befd745" containerName="glance-httpd" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.177646 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a9dba4-89bb-4b87-a1cf-3f031befd745" containerName="glance-log" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.178706 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.188053 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.188323 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.188987 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.219147 4777 scope.go:117] "RemoveContainer" containerID="daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296" Feb 16 21:59:43 crc kubenswrapper[4777]: E0216 21:59:43.230451 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296\": container with ID starting with daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296 not found: ID does not exist" containerID="daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.230490 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296"} err="failed to get container status \"daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296\": rpc error: code = NotFound desc = could not find container \"daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296\": container with ID starting with daf45c87d6278a87a51b8be8f2287b06457778917c5428bf822aded12450b296 not found: ID does not exist" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.230516 4777 scope.go:117] "RemoveContainer" 
containerID="333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858" Feb 16 21:59:43 crc kubenswrapper[4777]: E0216 21:59:43.231030 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858\": container with ID starting with 333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858 not found: ID does not exist" containerID="333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.231052 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858"} err="failed to get container status \"333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858\": rpc error: code = NotFound desc = could not find container \"333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858\": container with ID starting with 333cd98f157d8ecfd2d405d55fb53d9533bdbd89772a7e908898e32e1b91c858 not found: ID does not exist" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.335512 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.335619 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " 
pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.335685 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.335767 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.335788 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwwvc\" (UniqueName: \"kubernetes.io/projected/206cc213-04d8-40c6-befb-914351a6c1fe-kube-api-access-kwwvc\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.335813 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/206cc213-04d8-40c6-befb-914351a6c1fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.336215 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/206cc213-04d8-40c6-befb-914351a6c1fe-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.336254 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.439056 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.439131 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.439154 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.439171 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwwvc\" (UniqueName: \"kubernetes.io/projected/206cc213-04d8-40c6-befb-914351a6c1fe-kube-api-access-kwwvc\") pod 
\"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.439198 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/206cc213-04d8-40c6-befb-914351a6c1fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.439225 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/206cc213-04d8-40c6-befb-914351a6c1fe-logs\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.439242 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.439305 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.441792 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/206cc213-04d8-40c6-befb-914351a6c1fe-logs\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " 
pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.442001 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/206cc213-04d8-40c6-befb-914351a6c1fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.445874 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.470743 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.472226 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.472688 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/206cc213-04d8-40c6-befb-914351a6c1fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.474099 
4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwwvc\" (UniqueName: \"kubernetes.io/projected/206cc213-04d8-40c6-befb-914351a6c1fe-kube-api-access-kwwvc\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.483766 4777 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.483849 4777 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb51e62c662de87643413e62600db5d03a822462f1c1c4cd14c3bd046c1343d0/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.585996 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7bdc5d0c-9a23-4f73-990b-9bf02810d5c2\") pod \"glance-default-internal-api-0\" (UID: \"206cc213-04d8-40c6-befb-914351a6c1fe\") " pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.818012 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dmsfj" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.829881 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.849576 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tgbml" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.876626 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.901264 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f289-account-create-update-79rg7" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.904349 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1962-account-create-update-sdcsp" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.920135 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5dkfq" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.963967 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a1218f6-7d46-4c08-be98-e27104f96caf-operator-scripts\") pod \"5a1218f6-7d46-4c08-be98-e27104f96caf\" (UID: \"5a1218f6-7d46-4c08-be98-e27104f96caf\") " Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.964518 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54c47b5e-f997-4751-9fe6-e35d904a7fea-operator-scripts\") pod \"54c47b5e-f997-4751-9fe6-e35d904a7fea\" (UID: \"54c47b5e-f997-4751-9fe6-e35d904a7fea\") " Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.964635 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqlmd\" (UniqueName: 
\"kubernetes.io/projected/df482342-2142-4778-9751-7f8c8daaccd8-kube-api-access-lqlmd\") pod \"df482342-2142-4778-9751-7f8c8daaccd8\" (UID: \"df482342-2142-4778-9751-7f8c8daaccd8\") " Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.964663 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvhwj\" (UniqueName: \"kubernetes.io/projected/5a1218f6-7d46-4c08-be98-e27104f96caf-kube-api-access-vvhwj\") pod \"5a1218f6-7d46-4c08-be98-e27104f96caf\" (UID: \"5a1218f6-7d46-4c08-be98-e27104f96caf\") " Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.964811 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n88rz\" (UniqueName: \"kubernetes.io/projected/54c47b5e-f997-4751-9fe6-e35d904a7fea-kube-api-access-n88rz\") pod \"54c47b5e-f997-4751-9fe6-e35d904a7fea\" (UID: \"54c47b5e-f997-4751-9fe6-e35d904a7fea\") " Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.964843 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df482342-2142-4778-9751-7f8c8daaccd8-operator-scripts\") pod \"df482342-2142-4778-9751-7f8c8daaccd8\" (UID: \"df482342-2142-4778-9751-7f8c8daaccd8\") " Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.966050 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df482342-2142-4778-9751-7f8c8daaccd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df482342-2142-4778-9751-7f8c8daaccd8" (UID: "df482342-2142-4778-9751-7f8c8daaccd8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.967479 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a1218f6-7d46-4c08-be98-e27104f96caf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5a1218f6-7d46-4c08-be98-e27104f96caf" (UID: "5a1218f6-7d46-4c08-be98-e27104f96caf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.967844 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c47b5e-f997-4751-9fe6-e35d904a7fea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54c47b5e-f997-4751-9fe6-e35d904a7fea" (UID: "54c47b5e-f997-4751-9fe6-e35d904a7fea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.974838 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c47b5e-f997-4751-9fe6-e35d904a7fea-kube-api-access-n88rz" (OuterVolumeSpecName: "kube-api-access-n88rz") pod "54c47b5e-f997-4751-9fe6-e35d904a7fea" (UID: "54c47b5e-f997-4751-9fe6-e35d904a7fea"). InnerVolumeSpecName "kube-api-access-n88rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.984617 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df482342-2142-4778-9751-7f8c8daaccd8-kube-api-access-lqlmd" (OuterVolumeSpecName: "kube-api-access-lqlmd") pod "df482342-2142-4778-9751-7f8c8daaccd8" (UID: "df482342-2142-4778-9751-7f8c8daaccd8"). InnerVolumeSpecName "kube-api-access-lqlmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:43 crc kubenswrapper[4777]: I0216 21:59:43.991231 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1218f6-7d46-4c08-be98-e27104f96caf-kube-api-access-vvhwj" (OuterVolumeSpecName: "kube-api-access-vvhwj") pod "5a1218f6-7d46-4c08-be98-e27104f96caf" (UID: "5a1218f6-7d46-4c08-be98-e27104f96caf"). InnerVolumeSpecName "kube-api-access-vvhwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.066482 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-operator-scripts\") pod \"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc\" (UID: \"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.066585 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr62f\" (UniqueName: \"kubernetes.io/projected/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-kube-api-access-fr62f\") pod \"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc\" (UID: \"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.066622 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8vhz\" (UniqueName: \"kubernetes.io/projected/e22e9b3b-7b09-4704-a1aa-b219e416360c-kube-api-access-q8vhz\") pod \"e22e9b3b-7b09-4704-a1aa-b219e416360c\" (UID: \"e22e9b3b-7b09-4704-a1aa-b219e416360c\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.066646 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxkgl\" (UniqueName: \"kubernetes.io/projected/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-kube-api-access-rxkgl\") pod \"95770934-2a37-4fc6-b3e4-5ffd1e429f4b\" (UID: \"95770934-2a37-4fc6-b3e4-5ffd1e429f4b\") " Feb 16 21:59:44 
crc kubenswrapper[4777]: I0216 21:59:44.066754 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e22e9b3b-7b09-4704-a1aa-b219e416360c-operator-scripts\") pod \"e22e9b3b-7b09-4704-a1aa-b219e416360c\" (UID: \"e22e9b3b-7b09-4704-a1aa-b219e416360c\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.066798 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-operator-scripts\") pod \"95770934-2a37-4fc6-b3e4-5ffd1e429f4b\" (UID: \"95770934-2a37-4fc6-b3e4-5ffd1e429f4b\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.066932 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "095f5bf0-5bb9-42b3-ae28-a7bef2475cfc" (UID: "095f5bf0-5bb9-42b3-ae28-a7bef2475cfc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.068005 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e22e9b3b-7b09-4704-a1aa-b219e416360c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e22e9b3b-7b09-4704-a1aa-b219e416360c" (UID: "e22e9b3b-7b09-4704-a1aa-b219e416360c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.067906 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54c47b5e-f997-4751-9fe6-e35d904a7fea-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.068074 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.068086 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqlmd\" (UniqueName: \"kubernetes.io/projected/df482342-2142-4778-9751-7f8c8daaccd8-kube-api-access-lqlmd\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.068116 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvhwj\" (UniqueName: \"kubernetes.io/projected/5a1218f6-7d46-4c08-be98-e27104f96caf-kube-api-access-vvhwj\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.068125 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n88rz\" (UniqueName: \"kubernetes.io/projected/54c47b5e-f997-4751-9fe6-e35d904a7fea-kube-api-access-n88rz\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.068134 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df482342-2142-4778-9751-7f8c8daaccd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.068142 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a1218f6-7d46-4c08-be98-e27104f96caf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 
21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.068446 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95770934-2a37-4fc6-b3e4-5ffd1e429f4b" (UID: "95770934-2a37-4fc6-b3e4-5ffd1e429f4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.070747 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-kube-api-access-rxkgl" (OuterVolumeSpecName: "kube-api-access-rxkgl") pod "95770934-2a37-4fc6-b3e4-5ffd1e429f4b" (UID: "95770934-2a37-4fc6-b3e4-5ffd1e429f4b"). InnerVolumeSpecName "kube-api-access-rxkgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.071471 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22e9b3b-7b09-4704-a1aa-b219e416360c-kube-api-access-q8vhz" (OuterVolumeSpecName: "kube-api-access-q8vhz") pod "e22e9b3b-7b09-4704-a1aa-b219e416360c" (UID: "e22e9b3b-7b09-4704-a1aa-b219e416360c"). InnerVolumeSpecName "kube-api-access-q8vhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.071964 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-kube-api-access-fr62f" (OuterVolumeSpecName: "kube-api-access-fr62f") pod "095f5bf0-5bb9-42b3-ae28-a7bef2475cfc" (UID: "095f5bf0-5bb9-42b3-ae28-a7bef2475cfc"). InnerVolumeSpecName "kube-api-access-fr62f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.102105 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" event={"ID":"df482342-2142-4778-9751-7f8c8daaccd8","Type":"ContainerDied","Data":"d87aad1e3211ecfbd3c657c7af56f4624f98a9f866a42caff7f9121740598e8c"} Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.102139 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d87aad1e3211ecfbd3c657c7af56f4624f98a9f866a42caff7f9121740598e8c" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.102191 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-dbc0-account-create-update-l6rwv" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.105050 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f289-account-create-update-79rg7" event={"ID":"095f5bf0-5bb9-42b3-ae28-a7bef2475cfc","Type":"ContainerDied","Data":"de88439188890274a1df4e54ae1a3d3eb517f74e7e26baef15c3cf8578ef56be"} Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.105079 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de88439188890274a1df4e54ae1a3d3eb517f74e7e26baef15c3cf8578ef56be" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.105140 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f289-account-create-update-79rg7" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.107365 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1962-account-create-update-sdcsp" event={"ID":"95770934-2a37-4fc6-b3e4-5ffd1e429f4b","Type":"ContainerDied","Data":"089b6bdbfd3a3887cf92f9d60515f7698fdad0013508b86286429ff860cdc55d"} Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.107404 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="089b6bdbfd3a3887cf92f9d60515f7698fdad0013508b86286429ff860cdc55d" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.107460 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1962-account-create-update-sdcsp" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.115734 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tgbml" event={"ID":"5a1218f6-7d46-4c08-be98-e27104f96caf","Type":"ContainerDied","Data":"6b1b34ae0a7f3ab7804be2fcffd51f460c26c908c1b1a6cb2bd03ceb99266f55"} Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.115771 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b1b34ae0a7f3ab7804be2fcffd51f460c26c908c1b1a6cb2bd03ceb99266f55" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.115825 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tgbml" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.130365 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5dkfq" event={"ID":"e22e9b3b-7b09-4704-a1aa-b219e416360c","Type":"ContainerDied","Data":"ef051d2d31bc5df1f6e70c0db6ced1a700d22dd6b14cacd4987e0c45cdf944bf"} Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.130389 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5dkfq" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.130399 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef051d2d31bc5df1f6e70c0db6ced1a700d22dd6b14cacd4987e0c45cdf944bf" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.135761 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dmsfj" event={"ID":"54c47b5e-f997-4751-9fe6-e35d904a7fea","Type":"ContainerDied","Data":"e50139bc34687a911910f899bf8d8d0c9ed61544f9bd3a6f38e44729b01c235a"} Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.135792 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e50139bc34687a911910f899bf8d8d0c9ed61544f9bd3a6f38e44729b01c235a" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.135848 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dmsfj" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.156967 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2efa4577-f0b4-4722-98eb-b438aaecd156","Type":"ContainerStarted","Data":"50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028"} Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.166903 4777 generic.go:334] "Generic (PLEG): container finished" podID="630e92a1-ae10-4fa5-a006-fac7619ca69b" containerID="ef4f14124dffb958ed3c2a29c02c7b07e2bbb0f1414bd61882ba057b313b2fef" exitCode=137 Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.166941 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"630e92a1-ae10-4fa5-a006-fac7619ca69b","Type":"ContainerDied","Data":"ef4f14124dffb958ed3c2a29c02c7b07e2bbb0f1414bd61882ba057b313b2fef"} Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.170877 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e22e9b3b-7b09-4704-a1aa-b219e416360c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.170903 4777 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.170913 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr62f\" (UniqueName: \"kubernetes.io/projected/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc-kube-api-access-fr62f\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.170924 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8vhz\" (UniqueName: \"kubernetes.io/projected/e22e9b3b-7b09-4704-a1aa-b219e416360c-kube-api-access-q8vhz\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.170933 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxkgl\" (UniqueName: \"kubernetes.io/projected/95770934-2a37-4fc6-b3e4-5ffd1e429f4b-kube-api-access-rxkgl\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.214407 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.216638 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a9dba4-89bb-4b87-a1cf-3f031befd745" path="/var/lib/kubelet/pods/71a9dba4-89bb-4b87-a1cf-3f031befd745/volumes" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.217838 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728c28d7-febe-4196-85fe-55e8278e8899" path="/var/lib/kubelet/pods/728c28d7-febe-4196-85fe-55e8278e8899/volumes" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.374245 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/630e92a1-ae10-4fa5-a006-fac7619ca69b-logs\") pod \"630e92a1-ae10-4fa5-a006-fac7619ca69b\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.374307 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-combined-ca-bundle\") pod \"630e92a1-ae10-4fa5-a006-fac7619ca69b\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.374407 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/630e92a1-ae10-4fa5-a006-fac7619ca69b-etc-machine-id\") pod \"630e92a1-ae10-4fa5-a006-fac7619ca69b\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.374548 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-scripts\") pod \"630e92a1-ae10-4fa5-a006-fac7619ca69b\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 
21:59:44.374578 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmwcp\" (UniqueName: \"kubernetes.io/projected/630e92a1-ae10-4fa5-a006-fac7619ca69b-kube-api-access-qmwcp\") pod \"630e92a1-ae10-4fa5-a006-fac7619ca69b\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.374609 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data\") pod \"630e92a1-ae10-4fa5-a006-fac7619ca69b\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.374633 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data-custom\") pod \"630e92a1-ae10-4fa5-a006-fac7619ca69b\" (UID: \"630e92a1-ae10-4fa5-a006-fac7619ca69b\") " Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.375584 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/630e92a1-ae10-4fa5-a006-fac7619ca69b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "630e92a1-ae10-4fa5-a006-fac7619ca69b" (UID: "630e92a1-ae10-4fa5-a006-fac7619ca69b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.375971 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/630e92a1-ae10-4fa5-a006-fac7619ca69b-logs" (OuterVolumeSpecName: "logs") pod "630e92a1-ae10-4fa5-a006-fac7619ca69b" (UID: "630e92a1-ae10-4fa5-a006-fac7619ca69b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.380116 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "630e92a1-ae10-4fa5-a006-fac7619ca69b" (UID: "630e92a1-ae10-4fa5-a006-fac7619ca69b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.380563 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-scripts" (OuterVolumeSpecName: "scripts") pod "630e92a1-ae10-4fa5-a006-fac7619ca69b" (UID: "630e92a1-ae10-4fa5-a006-fac7619ca69b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.388934 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630e92a1-ae10-4fa5-a006-fac7619ca69b-kube-api-access-qmwcp" (OuterVolumeSpecName: "kube-api-access-qmwcp") pod "630e92a1-ae10-4fa5-a006-fac7619ca69b" (UID: "630e92a1-ae10-4fa5-a006-fac7619ca69b"). InnerVolumeSpecName "kube-api-access-qmwcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.412136 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "630e92a1-ae10-4fa5-a006-fac7619ca69b" (UID: "630e92a1-ae10-4fa5-a006-fac7619ca69b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.444379 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data" (OuterVolumeSpecName: "config-data") pod "630e92a1-ae10-4fa5-a006-fac7619ca69b" (UID: "630e92a1-ae10-4fa5-a006-fac7619ca69b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.471767 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.476958 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.476986 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmwcp\" (UniqueName: \"kubernetes.io/projected/630e92a1-ae10-4fa5-a006-fac7619ca69b-kube-api-access-qmwcp\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.476998 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.477009 4777 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.477018 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/630e92a1-ae10-4fa5-a006-fac7619ca69b-logs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: 
I0216 21:59:44.477027 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630e92a1-ae10-4fa5-a006-fac7619ca69b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.477034 4777 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/630e92a1-ae10-4fa5-a006-fac7619ca69b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.622422 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 16 21:59:44 crc kubenswrapper[4777]: W0216 21:59:44.627331 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod206cc213_04d8_40c6_befb_914351a6c1fe.slice/crio-df5b7e0e9e66ebfe513cbed1dfd615de82a473f51c9e37ea8c6c42eb838078b4 WatchSource:0}: Error finding container df5b7e0e9e66ebfe513cbed1dfd615de82a473f51c9e37ea8c6c42eb838078b4: Status 404 returned error can't find the container with id df5b7e0e9e66ebfe513cbed1dfd615de82a473f51c9e37ea8c6c42eb838078b4 Feb 16 21:59:44 crc kubenswrapper[4777]: I0216 21:59:44.979581 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5dd8c5d5-n2w9s" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.085429 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f8f666f74-96kzt"] Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.085659 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f8f666f74-96kzt" podUID="3e40580f-139c-49bd-9f65-72896786b32d" containerName="neutron-api" containerID="cri-o://1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59" gracePeriod=30 Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.086101 4777 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/neutron-6f8f666f74-96kzt" podUID="3e40580f-139c-49bd-9f65-72896786b32d" containerName="neutron-httpd" containerID="cri-o://58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822" gracePeriod=30 Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.185379 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"206cc213-04d8-40c6-befb-914351a6c1fe","Type":"ContainerStarted","Data":"df5b7e0e9e66ebfe513cbed1dfd615de82a473f51c9e37ea8c6c42eb838078b4"} Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.201877 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"630e92a1-ae10-4fa5-a006-fac7619ca69b","Type":"ContainerDied","Data":"58287e0a8931a5f4864bc6f8c253b2dcde56392b6d35ba26552c21b4533acd11"} Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.201897 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.201924 4777 scope.go:117] "RemoveContainer" containerID="ef4f14124dffb958ed3c2a29c02c7b07e2bbb0f1414bd61882ba057b313b2fef" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.211307 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2efa4577-f0b4-4722-98eb-b438aaecd156","Type":"ContainerStarted","Data":"a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174"} Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.219196 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3dc126ff-fbc5-4c23-8a19-084e71677f29","Type":"ContainerStarted","Data":"4ca9b5468c01388be72ccda61bbd0b07d62726409459a085568a6c78972741ac"} Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.244032 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 16 
21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.267922 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.274273 4777 scope.go:117] "RemoveContainer" containerID="78881c3659493c7079f93fa1057574cdc44c1cdd9aae67eb0127eeb20fc1f778" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.275947 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 16 21:59:45 crc kubenswrapper[4777]: E0216 21:59:45.276315 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630e92a1-ae10-4fa5-a006-fac7619ca69b" containerName="cinder-api-log" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276332 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="630e92a1-ae10-4fa5-a006-fac7619ca69b" containerName="cinder-api-log" Feb 16 21:59:45 crc kubenswrapper[4777]: E0216 21:59:45.276339 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630e92a1-ae10-4fa5-a006-fac7619ca69b" containerName="cinder-api" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276345 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="630e92a1-ae10-4fa5-a006-fac7619ca69b" containerName="cinder-api" Feb 16 21:59:45 crc kubenswrapper[4777]: E0216 21:59:45.276359 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22e9b3b-7b09-4704-a1aa-b219e416360c" containerName="mariadb-database-create" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276366 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22e9b3b-7b09-4704-a1aa-b219e416360c" containerName="mariadb-database-create" Feb 16 21:59:45 crc kubenswrapper[4777]: E0216 21:59:45.276377 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095f5bf0-5bb9-42b3-ae28-a7bef2475cfc" containerName="mariadb-account-create-update" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276383 4777 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="095f5bf0-5bb9-42b3-ae28-a7bef2475cfc" containerName="mariadb-account-create-update" Feb 16 21:59:45 crc kubenswrapper[4777]: E0216 21:59:45.276394 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1218f6-7d46-4c08-be98-e27104f96caf" containerName="mariadb-database-create" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276400 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1218f6-7d46-4c08-be98-e27104f96caf" containerName="mariadb-database-create" Feb 16 21:59:45 crc kubenswrapper[4777]: E0216 21:59:45.276417 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c47b5e-f997-4751-9fe6-e35d904a7fea" containerName="mariadb-database-create" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276422 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c47b5e-f997-4751-9fe6-e35d904a7fea" containerName="mariadb-database-create" Feb 16 21:59:45 crc kubenswrapper[4777]: E0216 21:59:45.276441 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df482342-2142-4778-9751-7f8c8daaccd8" containerName="mariadb-account-create-update" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276446 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="df482342-2142-4778-9751-7f8c8daaccd8" containerName="mariadb-account-create-update" Feb 16 21:59:45 crc kubenswrapper[4777]: E0216 21:59:45.276463 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95770934-2a37-4fc6-b3e4-5ffd1e429f4b" containerName="mariadb-account-create-update" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276470 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="95770934-2a37-4fc6-b3e4-5ffd1e429f4b" containerName="mariadb-account-create-update" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276638 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22e9b3b-7b09-4704-a1aa-b219e416360c" containerName="mariadb-database-create" Feb 16 21:59:45 crc 
kubenswrapper[4777]: I0216 21:59:45.276650 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="630e92a1-ae10-4fa5-a006-fac7619ca69b" containerName="cinder-api" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276661 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="95770934-2a37-4fc6-b3e4-5ffd1e429f4b" containerName="mariadb-account-create-update" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276674 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="630e92a1-ae10-4fa5-a006-fac7619ca69b" containerName="cinder-api-log" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276692 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="095f5bf0-5bb9-42b3-ae28-a7bef2475cfc" containerName="mariadb-account-create-update" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276700 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="df482342-2142-4778-9751-7f8c8daaccd8" containerName="mariadb-account-create-update" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276709 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c47b5e-f997-4751-9fe6-e35d904a7fea" containerName="mariadb-database-create" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.276736 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1218f6-7d46-4c08-be98-e27104f96caf" containerName="mariadb-database-create" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.278206 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.285134 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.285477 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.285911 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.295364 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.408292 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-config-data-custom\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.408619 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-scripts\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.408665 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.408752 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.408784 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6e9a17-49c6-4977-af5d-f21441665952-logs\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.409075 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6e9a17-49c6-4977-af5d-f21441665952-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.409161 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.409203 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-config-data\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.409281 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dldl7\" (UniqueName: \"kubernetes.io/projected/3e6e9a17-49c6-4977-af5d-f21441665952-kube-api-access-dldl7\") pod 
\"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.511318 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6e9a17-49c6-4977-af5d-f21441665952-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.511361 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.511385 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-config-data\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.511416 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dldl7\" (UniqueName: \"kubernetes.io/projected/3e6e9a17-49c6-4977-af5d-f21441665952-kube-api-access-dldl7\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.511449 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-config-data-custom\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.511483 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-scripts\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.511510 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.511536 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6e9a17-49c6-4977-af5d-f21441665952-logs\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.511550 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.512510 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6e9a17-49c6-4977-af5d-f21441665952-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.512979 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6e9a17-49c6-4977-af5d-f21441665952-logs\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 
21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.516559 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-scripts\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.517517 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.517673 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-config-data-custom\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.519408 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.519753 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-config-data\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.519895 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6e9a17-49c6-4977-af5d-f21441665952-combined-ca-bundle\") pod \"cinder-api-0\" 
(UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.531143 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dldl7\" (UniqueName: \"kubernetes.io/projected/3e6e9a17-49c6-4977-af5d-f21441665952-kube-api-access-dldl7\") pod \"cinder-api-0\" (UID: \"3e6e9a17-49c6-4977-af5d-f21441665952\") " pod="openstack/cinder-api-0" Feb 16 21:59:45 crc kubenswrapper[4777]: I0216 21:59:45.622158 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.124029 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.195055 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630e92a1-ae10-4fa5-a006-fac7619ca69b" path="/var/lib/kubelet/pods/630e92a1-ae10-4fa5-a006-fac7619ca69b/volumes" Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.276164 4777 generic.go:334] "Generic (PLEG): container finished" podID="3e40580f-139c-49bd-9f65-72896786b32d" containerID="58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822" exitCode=0 Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.276237 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f8f666f74-96kzt" event={"ID":"3e40580f-139c-49bd-9f65-72896786b32d","Type":"ContainerDied","Data":"58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822"} Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.281320 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2efa4577-f0b4-4722-98eb-b438aaecd156","Type":"ContainerStarted","Data":"dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4"} Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.281516 4777 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/ceilometer-0" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="ceilometer-central-agent" containerID="cri-o://3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf" gracePeriod=30 Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.281601 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="ceilometer-notification-agent" containerID="cri-o://50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028" gracePeriod=30 Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.281617 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="sg-core" containerID="cri-o://a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174" gracePeriod=30 Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.284217 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3dc126ff-fbc5-4c23-8a19-084e71677f29","Type":"ContainerStarted","Data":"92f4232e696d1ea1dbbaa5c54e57a5a9bd147c6330d1d060522ab4d7227fb568"} Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.284259 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3dc126ff-fbc5-4c23-8a19-084e71677f29","Type":"ContainerStarted","Data":"01b76c0fdae98c3e7afc554df6b2ce571ebf3ce39c1a0b85f8da33cfe7c0d425"} Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.292032 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"206cc213-04d8-40c6-befb-914351a6c1fe","Type":"ContainerStarted","Data":"d0e20bcdbb7c9a5cbc24b042595879e02823c6dd3a096e1717ea05f6cee5aeda"} Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.293742 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"3e6e9a17-49c6-4977-af5d-f21441665952","Type":"ContainerStarted","Data":"b8bc5195560490fe07a89243968475f7581a00f61e4427617bbfa8a4ca6e963a"} Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.335175 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.335149546 podStartE2EDuration="4.335149546s" podCreationTimestamp="2026-02-16 21:59:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:46.328963523 +0000 UTC m=+1306.911464625" watchObservedRunningTime="2026-02-16 21:59:46.335149546 +0000 UTC m=+1306.917650658" Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.997267 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h7jcx"] Feb 16 21:59:46 crc kubenswrapper[4777]: I0216 21:59:46.999464 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.005849 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h7jcx"] Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.014022 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.014248 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.017766 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-cjxdb" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.148652 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-scripts\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.149041 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-config-data\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.149113 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnv59\" (UniqueName: \"kubernetes.io/projected/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-kube-api-access-mnv59\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " 
pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.149219 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.253849 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-scripts\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.254149 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-config-data\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.254236 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnv59\" (UniqueName: \"kubernetes.io/projected/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-kube-api-access-mnv59\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.254323 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: 
\"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.261254 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.261259 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-scripts\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.264631 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.263616 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-config-data\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.286958 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnv59\" (UniqueName: \"kubernetes.io/projected/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-kube-api-access-mnv59\") pod \"nova-cell0-conductor-db-sync-h7jcx\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") " pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.332861 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"206cc213-04d8-40c6-befb-914351a6c1fe","Type":"ContainerStarted","Data":"d3ccc2bfb72b32c723de503a5dd7ca49c89169c9330cc6d4f9bece0e0100890f"} Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.334999 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h7jcx" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.365272 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e6e9a17-49c6-4977-af5d-f21441665952","Type":"ContainerStarted","Data":"6780b495fe9630a92af0192cbddaa5c3c3fd7d84bfe85af3324295d15ad28151"} Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.367407 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7rdr\" (UniqueName: \"kubernetes.io/projected/3e40580f-139c-49bd-9f65-72896786b32d-kube-api-access-d7rdr\") pod \"3e40580f-139c-49bd-9f65-72896786b32d\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.367465 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-combined-ca-bundle\") pod \"3e40580f-139c-49bd-9f65-72896786b32d\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.367524 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-ovndb-tls-certs\") pod \"3e40580f-139c-49bd-9f65-72896786b32d\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.367657 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-httpd-config\") pod \"3e40580f-139c-49bd-9f65-72896786b32d\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.367698 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-config\") pod \"3e40580f-139c-49bd-9f65-72896786b32d\" (UID: \"3e40580f-139c-49bd-9f65-72896786b32d\") " Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.371219 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e40580f-139c-49bd-9f65-72896786b32d-kube-api-access-d7rdr" (OuterVolumeSpecName: "kube-api-access-d7rdr") pod "3e40580f-139c-49bd-9f65-72896786b32d" (UID: "3e40580f-139c-49bd-9f65-72896786b32d"). InnerVolumeSpecName "kube-api-access-d7rdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.371226 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.371210823 podStartE2EDuration="4.371210823s" podCreationTimestamp="2026-02-16 21:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:47.365498133 +0000 UTC m=+1307.947999235" watchObservedRunningTime="2026-02-16 21:59:47.371210823 +0000 UTC m=+1307.953711925" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.372134 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3e40580f-139c-49bd-9f65-72896786b32d" (UID: "3e40580f-139c-49bd-9f65-72896786b32d"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.376282 4777 generic.go:334] "Generic (PLEG): container finished" podID="3e40580f-139c-49bd-9f65-72896786b32d" containerID="1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59" exitCode=0 Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.376341 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f8f666f74-96kzt" event={"ID":"3e40580f-139c-49bd-9f65-72896786b32d","Type":"ContainerDied","Data":"1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59"} Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.376368 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f8f666f74-96kzt" event={"ID":"3e40580f-139c-49bd-9f65-72896786b32d","Type":"ContainerDied","Data":"5bb503efcff6dd33d7b2d6780a3b8374341813c5efc91b9cf66e151ba5805df9"} Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.376384 4777 scope.go:117] "RemoveContainer" containerID="58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.376505 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f8f666f74-96kzt" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.419675 4777 generic.go:334] "Generic (PLEG): container finished" podID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerID="dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4" exitCode=1 Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.419803 4777 generic.go:334] "Generic (PLEG): container finished" podID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerID="a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174" exitCode=2 Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.419812 4777 generic.go:334] "Generic (PLEG): container finished" podID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerID="50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028" exitCode=0 Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.420109 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2efa4577-f0b4-4722-98eb-b438aaecd156","Type":"ContainerDied","Data":"dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4"} Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.420165 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2efa4577-f0b4-4722-98eb-b438aaecd156","Type":"ContainerDied","Data":"a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174"} Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.420176 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2efa4577-f0b4-4722-98eb-b438aaecd156","Type":"ContainerDied","Data":"50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028"} Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.445117 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-config" (OuterVolumeSpecName: "config") pod 
"3e40580f-139c-49bd-9f65-72896786b32d" (UID: "3e40580f-139c-49bd-9f65-72896786b32d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.445174 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e40580f-139c-49bd-9f65-72896786b32d" (UID: "3e40580f-139c-49bd-9f65-72896786b32d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.470795 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3e40580f-139c-49bd-9f65-72896786b32d" (UID: "3e40580f-139c-49bd-9f65-72896786b32d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.471554 4777 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.471633 4777 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.471693 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-config\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.471765 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7rdr\" (UniqueName: \"kubernetes.io/projected/3e40580f-139c-49bd-9f65-72896786b32d-kube-api-access-d7rdr\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.471833 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e40580f-139c-49bd-9f65-72896786b32d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.478830 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.481011 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-69c8b968f9-qldk7" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.602850 4777 scope.go:117] "RemoveContainer" containerID="1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.633283 4777 
scope.go:117] "RemoveContainer" containerID="58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822" Feb 16 21:59:47 crc kubenswrapper[4777]: E0216 21:59:47.634231 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822\": container with ID starting with 58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822 not found: ID does not exist" containerID="58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.634262 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822"} err="failed to get container status \"58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822\": rpc error: code = NotFound desc = could not find container \"58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822\": container with ID starting with 58ad15b20eb674e746f4757f73a5eda9a0c09fc50bb3ce526b056364925fa822 not found: ID does not exist" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.634283 4777 scope.go:117] "RemoveContainer" containerID="1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59" Feb 16 21:59:47 crc kubenswrapper[4777]: E0216 21:59:47.637178 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59\": container with ID starting with 1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59 not found: ID does not exist" containerID="1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.637768 4777 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59"} err="failed to get container status \"1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59\": rpc error: code = NotFound desc = could not find container \"1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59\": container with ID starting with 1312e34b15161c5e997310fc21afa4f790483e10caa0a46dc5feeaa1db66ec59 not found: ID does not exist" Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.718954 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f8f666f74-96kzt"] Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.727530 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f8f666f74-96kzt"] Feb 16 21:59:47 crc kubenswrapper[4777]: W0216 21:59:47.990620 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2997f9cf_78c3_458d_b5c4_7c774d1c84a8.slice/crio-baef23591189686a728982bffc0d4d112f9b2718cceacd0f698a6542abd18a1c WatchSource:0}: Error finding container baef23591189686a728982bffc0d4d112f9b2718cceacd0f698a6542abd18a1c: Status 404 returned error can't find the container with id baef23591189686a728982bffc0d4d112f9b2718cceacd0f698a6542abd18a1c Feb 16 21:59:47 crc kubenswrapper[4777]: I0216 21:59:47.991075 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h7jcx"] Feb 16 21:59:48 crc kubenswrapper[4777]: I0216 21:59:48.193515 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e40580f-139c-49bd-9f65-72896786b32d" path="/var/lib/kubelet/pods/3e40580f-139c-49bd-9f65-72896786b32d/volumes" Feb 16 21:59:48 crc kubenswrapper[4777]: I0216 21:59:48.436990 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"3e6e9a17-49c6-4977-af5d-f21441665952","Type":"ContainerStarted","Data":"0579655a63e653f8f45914fb71c0d87fe6bfb63d9fdcdf81612753ac11923832"} Feb 16 21:59:48 crc kubenswrapper[4777]: I0216 21:59:48.437391 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 16 21:59:48 crc kubenswrapper[4777]: I0216 21:59:48.439702 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h7jcx" event={"ID":"2997f9cf-78c3-458d-b5c4-7c774d1c84a8","Type":"ContainerStarted","Data":"baef23591189686a728982bffc0d4d112f9b2718cceacd0f698a6542abd18a1c"} Feb 16 21:59:48 crc kubenswrapper[4777]: I0216 21:59:48.462232 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.462214018 podStartE2EDuration="3.462214018s" podCreationTimestamp="2026-02-16 21:59:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 21:59:48.456319453 +0000 UTC m=+1309.038820565" watchObservedRunningTime="2026-02-16 21:59:48.462214018 +0000 UTC m=+1309.044715130" Feb 16 21:59:49 crc kubenswrapper[4777]: I0216 21:59:49.001043 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="630e92a1-ae10-4fa5-a006-fac7619ca69b" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.183:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 16 21:59:49 crc kubenswrapper[4777]: I0216 21:59:49.656068 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57449bfc86-nk7r7" Feb 16 21:59:49 crc kubenswrapper[4777]: I0216 21:59:49.665316 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57449bfc86-nk7r7" Feb 16 21:59:49 crc kubenswrapper[4777]: I0216 21:59:49.737148 4777 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6f8865555d-rstbr"] Feb 16 21:59:49 crc kubenswrapper[4777]: I0216 21:59:49.737558 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6f8865555d-rstbr" podUID="e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" containerName="placement-api" containerID="cri-o://1270b0b911e5b618049966ffb4ab2a653fbf3f0e46723472dee8640b6326ac39" gracePeriod=30 Feb 16 21:59:49 crc kubenswrapper[4777]: I0216 21:59:49.737432 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6f8865555d-rstbr" podUID="e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" containerName="placement-log" containerID="cri-o://518cbf48dfd1b55bc0d5bed9f412daa7dbb1c5792c0295cab9e0b4b3f2ddb3b7" gracePeriod=30 Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.343917 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.436650 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-log-httpd\") pod \"2efa4577-f0b4-4722-98eb-b438aaecd156\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.436744 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-942jp\" (UniqueName: \"kubernetes.io/projected/2efa4577-f0b4-4722-98eb-b438aaecd156-kube-api-access-942jp\") pod \"2efa4577-f0b4-4722-98eb-b438aaecd156\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.436798 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-scripts\") pod \"2efa4577-f0b4-4722-98eb-b438aaecd156\" (UID: 
\"2efa4577-f0b4-4722-98eb-b438aaecd156\") " Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.436835 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-config-data\") pod \"2efa4577-f0b4-4722-98eb-b438aaecd156\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.436881 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-sg-core-conf-yaml\") pod \"2efa4577-f0b4-4722-98eb-b438aaecd156\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.436909 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-run-httpd\") pod \"2efa4577-f0b4-4722-98eb-b438aaecd156\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.436972 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-combined-ca-bundle\") pod \"2efa4577-f0b4-4722-98eb-b438aaecd156\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.437276 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2efa4577-f0b4-4722-98eb-b438aaecd156" (UID: "2efa4577-f0b4-4722-98eb-b438aaecd156"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.437615 4777 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.438951 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2efa4577-f0b4-4722-98eb-b438aaecd156" (UID: "2efa4577-f0b4-4722-98eb-b438aaecd156"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.443766 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-scripts" (OuterVolumeSpecName: "scripts") pod "2efa4577-f0b4-4722-98eb-b438aaecd156" (UID: "2efa4577-f0b4-4722-98eb-b438aaecd156"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.451901 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2efa4577-f0b4-4722-98eb-b438aaecd156-kube-api-access-942jp" (OuterVolumeSpecName: "kube-api-access-942jp") pod "2efa4577-f0b4-4722-98eb-b438aaecd156" (UID: "2efa4577-f0b4-4722-98eb-b438aaecd156"). InnerVolumeSpecName "kube-api-access-942jp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.470909 4777 generic.go:334] "Generic (PLEG): container finished" podID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerID="3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf" exitCode=0 Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.470977 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2efa4577-f0b4-4722-98eb-b438aaecd156","Type":"ContainerDied","Data":"3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf"} Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.471007 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2efa4577-f0b4-4722-98eb-b438aaecd156","Type":"ContainerDied","Data":"0844569ded7d98e5e635b441c04e684401983b79387d3bd1ee221ac5602ed855"} Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.471026 4777 scope.go:117] "RemoveContainer" containerID="dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.471162 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.474553 4777 generic.go:334] "Generic (PLEG): container finished" podID="e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" containerID="518cbf48dfd1b55bc0d5bed9f412daa7dbb1c5792c0295cab9e0b4b3f2ddb3b7" exitCode=143 Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.474617 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f8865555d-rstbr" event={"ID":"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9","Type":"ContainerDied","Data":"518cbf48dfd1b55bc0d5bed9f412daa7dbb1c5792c0295cab9e0b4b3f2ddb3b7"} Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.488947 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2efa4577-f0b4-4722-98eb-b438aaecd156" (UID: "2efa4577-f0b4-4722-98eb-b438aaecd156"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.523687 4777 scope.go:117] "RemoveContainer" containerID="a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.541222 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.541253 4777 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.541280 4777 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2efa4577-f0b4-4722-98eb-b438aaecd156-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.541289 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-942jp\" (UniqueName: \"kubernetes.io/projected/2efa4577-f0b4-4722-98eb-b438aaecd156-kube-api-access-942jp\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:50 crc kubenswrapper[4777]: E0216 21:59:50.544177 4777 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-combined-ca-bundle podName:2efa4577-f0b4-4722-98eb-b438aaecd156 nodeName:}" failed. No retries permitted until 2026-02-16 21:59:51.044149488 +0000 UTC m=+1311.626650590 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-combined-ca-bundle") pod "2efa4577-f0b4-4722-98eb-b438aaecd156" (UID: "2efa4577-f0b4-4722-98eb-b438aaecd156") : error deleting /var/lib/kubelet/pods/2efa4577-f0b4-4722-98eb-b438aaecd156/volume-subpaths: remove /var/lib/kubelet/pods/2efa4577-f0b4-4722-98eb-b438aaecd156/volume-subpaths: no such file or directory Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.546366 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-config-data" (OuterVolumeSpecName: "config-data") pod "2efa4577-f0b4-4722-98eb-b438aaecd156" (UID: "2efa4577-f0b4-4722-98eb-b438aaecd156"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.546613 4777 scope.go:117] "RemoveContainer" containerID="50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.574137 4777 scope.go:117] "RemoveContainer" containerID="3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.595924 4777 scope.go:117] "RemoveContainer" containerID="dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4" Feb 16 21:59:50 crc kubenswrapper[4777]: E0216 21:59:50.596805 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4\": container with ID starting with dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4 not found: ID does not exist" containerID="dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.596844 4777 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4"} err="failed to get container status \"dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4\": rpc error: code = NotFound desc = could not find container \"dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4\": container with ID starting with dffa321ce36f9280bfb24278636b584a3868ec2eb27136f8d44d333788708fc4 not found: ID does not exist" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.596871 4777 scope.go:117] "RemoveContainer" containerID="a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174" Feb 16 21:59:50 crc kubenswrapper[4777]: E0216 21:59:50.597336 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174\": container with ID starting with a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174 not found: ID does not exist" containerID="a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.597364 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174"} err="failed to get container status \"a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174\": rpc error: code = NotFound desc = could not find container \"a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174\": container with ID starting with a57cfd6ae32f328148dde1cd56730899201338d036d2feb1a0e8991c3e509174 not found: ID does not exist" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.597382 4777 scope.go:117] "RemoveContainer" containerID="50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028" Feb 16 21:59:50 crc kubenswrapper[4777]: E0216 21:59:50.597872 4777 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028\": container with ID starting with 50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028 not found: ID does not exist" containerID="50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.597906 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028"} err="failed to get container status \"50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028\": rpc error: code = NotFound desc = could not find container \"50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028\": container with ID starting with 50d746da3193df7e6523b0560ec63eeb70617a16d9f467ce233a4d67e9443028 not found: ID does not exist" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.597927 4777 scope.go:117] "RemoveContainer" containerID="3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf" Feb 16 21:59:50 crc kubenswrapper[4777]: E0216 21:59:50.598419 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf\": container with ID starting with 3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf not found: ID does not exist" containerID="3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.598463 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf"} err="failed to get container status \"3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf\": rpc error: code = NotFound desc = could not find container 
\"3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf\": container with ID starting with 3432ac0a9b9af8c5ce883e7cb14637b33ccae3c4522a072a6e8270c986e0a1bf not found: ID does not exist" Feb 16 21:59:50 crc kubenswrapper[4777]: I0216 21:59:50.642647 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.049681 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-combined-ca-bundle\") pod \"2efa4577-f0b4-4722-98eb-b438aaecd156\" (UID: \"2efa4577-f0b4-4722-98eb-b438aaecd156\") " Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.052501 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2efa4577-f0b4-4722-98eb-b438aaecd156" (UID: "2efa4577-f0b4-4722-98eb-b438aaecd156"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.135568 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.141943 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.160310 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.161737 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efa4577-f0b4-4722-98eb-b438aaecd156-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:51 crc kubenswrapper[4777]: E0216 21:59:51.162045 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="ceilometer-central-agent" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.162063 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="ceilometer-central-agent" Feb 16 21:59:51 crc kubenswrapper[4777]: E0216 21:59:51.162077 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="proxy-httpd" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.162085 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="proxy-httpd" Feb 16 21:59:51 crc kubenswrapper[4777]: E0216 21:59:51.162093 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="ceilometer-notification-agent" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.162100 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="ceilometer-notification-agent" Feb 
16 21:59:51 crc kubenswrapper[4777]: E0216 21:59:51.162113 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e40580f-139c-49bd-9f65-72896786b32d" containerName="neutron-httpd" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.162120 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e40580f-139c-49bd-9f65-72896786b32d" containerName="neutron-httpd" Feb 16 21:59:51 crc kubenswrapper[4777]: E0216 21:59:51.162133 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="sg-core" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.162139 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="sg-core" Feb 16 21:59:51 crc kubenswrapper[4777]: E0216 21:59:51.162154 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e40580f-139c-49bd-9f65-72896786b32d" containerName="neutron-api" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.162161 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e40580f-139c-49bd-9f65-72896786b32d" containerName="neutron-api" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.162447 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e40580f-139c-49bd-9f65-72896786b32d" containerName="neutron-api" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.162468 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="ceilometer-notification-agent" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.162483 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="sg-core" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.162513 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="ceilometer-central-agent" Feb 16 21:59:51 
crc kubenswrapper[4777]: I0216 21:59:51.162522 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e40580f-139c-49bd-9f65-72896786b32d" containerName="neutron-httpd" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.162532 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" containerName="proxy-httpd" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.164594 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.166753 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.166951 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.167071 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.197543 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.263439 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-scripts\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.263485 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-run-httpd\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.263515 4777 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.263538 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-log-httpd\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.263574 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-config-data\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.263658 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.263691 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xrlb\" (UniqueName: \"kubernetes.io/projected/bfc01a77-b188-4af1-8354-0fd361f6df84-kube-api-access-7xrlb\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.263724 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.334888 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.365596 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.365646 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-log-httpd\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.365690 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-config-data\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.365786 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.365823 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xrlb\" (UniqueName: 
\"kubernetes.io/projected/bfc01a77-b188-4af1-8354-0fd361f6df84-kube-api-access-7xrlb\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.365844 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.365899 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-scripts\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.365915 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-run-httpd\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.366302 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-run-httpd\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.368294 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-log-httpd\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.372709 4777 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.374069 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.377325 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-scripts\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.381926 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-config-data\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.382582 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") " pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.394701 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xrlb\" (UniqueName: \"kubernetes.io/projected/bfc01a77-b188-4af1-8354-0fd361f6df84-kube-api-access-7xrlb\") pod \"ceilometer-0\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") 
" pod="openstack/ceilometer-0" Feb 16 21:59:51 crc kubenswrapper[4777]: I0216 21:59:51.526412 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 21:59:52 crc kubenswrapper[4777]: I0216 21:59:52.194316 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2efa4577-f0b4-4722-98eb-b438aaecd156" path="/var/lib/kubelet/pods/2efa4577-f0b4-4722-98eb-b438aaecd156/volumes" Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.008943 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.010254 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.046342 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.053537 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 16 21:59:53 crc kubenswrapper[4777]: E0216 21:59:53.304872 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 21:59:53 crc kubenswrapper[4777]: E0216 21:59:53.304937 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 21:59:53 crc kubenswrapper[4777]: E0216 21:59:53.305125 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/
var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError" Feb 16 21:59:53 crc kubenswrapper[4777]: E0216 21:59:53.306858 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.511567 4777 generic.go:334] "Generic (PLEG): container finished" podID="e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" containerID="1270b0b911e5b618049966ffb4ab2a653fbf3f0e46723472dee8640b6326ac39" exitCode=0 Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.511644 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f8865555d-rstbr" event={"ID":"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9","Type":"ContainerDied","Data":"1270b0b911e5b618049966ffb4ab2a653fbf3f0e46723472dee8640b6326ac39"} Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.512088 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.512113 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.831343 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.831398 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.878920 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:53 crc kubenswrapper[4777]: I0216 21:59:53.890202 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:54 crc kubenswrapper[4777]: I0216 21:59:54.524759 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:54 crc kubenswrapper[4777]: 
I0216 21:59:54.524805 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:55 crc kubenswrapper[4777]: I0216 21:59:55.475868 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 16 21:59:55 crc kubenswrapper[4777]: I0216 21:59:55.478624 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.325963 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.389366 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.596952 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f8865555d-rstbr" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.669541 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ntrr\" (UniqueName: \"kubernetes.io/projected/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-kube-api-access-7ntrr\") pod \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.669580 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-combined-ca-bundle\") pod \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.669611 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-config-data\") pod \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.669644 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-internal-tls-certs\") pod \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.669672 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-public-tls-certs\") pod \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.669752 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-scripts\") pod \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.669799 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-logs\") pod \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\" (UID: \"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9\") " Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.673165 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-logs" (OuterVolumeSpecName: "logs") pod "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" (UID: "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.680694 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-scripts" (OuterVolumeSpecName: "scripts") pod "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" (UID: "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.686891 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-kube-api-access-7ntrr" (OuterVolumeSpecName: "kube-api-access-7ntrr") pod "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" (UID: "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9"). InnerVolumeSpecName "kube-api-access-7ntrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.774017 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.774058 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-logs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.774070 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ntrr\" (UniqueName: \"kubernetes.io/projected/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-kube-api-access-7ntrr\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.781050 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.789827 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" (UID: "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.791837 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-config-data" (OuterVolumeSpecName: "config-data") pod "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" (UID: "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.834809 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" (UID: "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.836085 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" (UID: "e7d06dbb-1aa3-475f-bbc4-352bc009d3b9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.876470 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.876514 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.876527 4777 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:56 crc kubenswrapper[4777]: I0216 21:59:56.876538 4777 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 21:59:57 crc kubenswrapper[4777]: I0216 21:59:57.566474 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f8865555d-rstbr" event={"ID":"e7d06dbb-1aa3-475f-bbc4-352bc009d3b9","Type":"ContainerDied","Data":"2838d1c10fa666b0b9be6802131a1609d86d10c49576ec940a6eeb20bfbb17f7"} Feb 16 21:59:57 crc kubenswrapper[4777]: I0216 21:59:57.566512 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6f8865555d-rstbr"
Feb 16 21:59:57 crc kubenswrapper[4777]: I0216 21:59:57.566938 4777 scope.go:117] "RemoveContainer" containerID="1270b0b911e5b618049966ffb4ab2a653fbf3f0e46723472dee8640b6326ac39"
Feb 16 21:59:57 crc kubenswrapper[4777]: I0216 21:59:57.568966 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h7jcx" event={"ID":"2997f9cf-78c3-458d-b5c4-7c774d1c84a8","Type":"ContainerStarted","Data":"b29243746f7ac132751aa74466f197b11898a9e98a2b8b1009dc3e5eecb977a6"}
Feb 16 21:59:57 crc kubenswrapper[4777]: I0216 21:59:57.570938 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc01a77-b188-4af1-8354-0fd361f6df84","Type":"ContainerStarted","Data":"cd608641f762b9b424242ef9eeda44be1a3cea72015e9344aac1434a2b9a1358"}
Feb 16 21:59:57 crc kubenswrapper[4777]: I0216 21:59:57.587034 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-h7jcx" podStartSLOduration=3.310378823 podStartE2EDuration="11.587015748s" podCreationTimestamp="2026-02-16 21:59:46 +0000 UTC" firstStartedPulling="2026-02-16 21:59:47.993272654 +0000 UTC m=+1308.575773756" lastFinishedPulling="2026-02-16 21:59:56.269909579 +0000 UTC m=+1316.852410681" observedRunningTime="2026-02-16 21:59:57.581347179 +0000 UTC m=+1318.163848281" watchObservedRunningTime="2026-02-16 21:59:57.587015748 +0000 UTC m=+1318.169516850"
Feb 16 21:59:57 crc kubenswrapper[4777]: I0216 21:59:57.589921 4777 scope.go:117] "RemoveContainer" containerID="518cbf48dfd1b55bc0d5bed9f412daa7dbb1c5792c0295cab9e0b4b3f2ddb3b7"
Feb 16 21:59:57 crc kubenswrapper[4777]: I0216 21:59:57.610387 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6f8865555d-rstbr"]
Feb 16 21:59:57 crc kubenswrapper[4777]: I0216 21:59:57.621336 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6f8865555d-rstbr"]
Feb 16 21:59:58 crc kubenswrapper[4777]: I0216 21:59:58.118355 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 16 21:59:58 crc kubenswrapper[4777]: I0216 21:59:58.200400 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" path="/var/lib/kubelet/pods/e7d06dbb-1aa3-475f-bbc4-352bc009d3b9/volumes"
Feb 16 21:59:58 crc kubenswrapper[4777]: I0216 21:59:58.595836 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc01a77-b188-4af1-8354-0fd361f6df84","Type":"ContainerStarted","Data":"efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c"}
Feb 16 21:59:58 crc kubenswrapper[4777]: I0216 21:59:58.596194 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc01a77-b188-4af1-8354-0fd361f6df84","Type":"ContainerStarted","Data":"de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1"}
Feb 16 21:59:59 crc kubenswrapper[4777]: I0216 21:59:59.615231 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc01a77-b188-4af1-8354-0fd361f6df84","Type":"ContainerStarted","Data":"96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7"}
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.139857 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"]
Feb 16 22:00:00 crc kubenswrapper[4777]: E0216 22:00:00.140590 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" containerName="placement-log"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.140609 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" containerName="placement-log"
Feb 16 22:00:00 crc kubenswrapper[4777]: E0216 22:00:00.140640 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" containerName="placement-api"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.140648 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" containerName="placement-api"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.140905 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" containerName="placement-log"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.140920 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d06dbb-1aa3-475f-bbc4-352bc009d3b9" containerName="placement-api"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.141892 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.145342 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.145592 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.178877 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"]
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.244231 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da54354b-4d85-416a-a642-980289af6893-secret-volume\") pod \"collect-profiles-29521320-9rh8h\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.244733 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bt5t\" (UniqueName: \"kubernetes.io/projected/da54354b-4d85-416a-a642-980289af6893-kube-api-access-6bt5t\") pod \"collect-profiles-29521320-9rh8h\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.244788 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da54354b-4d85-416a-a642-980289af6893-config-volume\") pod \"collect-profiles-29521320-9rh8h\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.279386 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.347070 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bt5t\" (UniqueName: \"kubernetes.io/projected/da54354b-4d85-416a-a642-980289af6893-kube-api-access-6bt5t\") pod \"collect-profiles-29521320-9rh8h\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.347116 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da54354b-4d85-416a-a642-980289af6893-config-volume\") pod \"collect-profiles-29521320-9rh8h\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.347155 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da54354b-4d85-416a-a642-980289af6893-secret-volume\") pod \"collect-profiles-29521320-9rh8h\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.348280 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da54354b-4d85-416a-a642-980289af6893-config-volume\") pod \"collect-profiles-29521320-9rh8h\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.352180 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da54354b-4d85-416a-a642-980289af6893-secret-volume\") pod \"collect-profiles-29521320-9rh8h\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.378956 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bt5t\" (UniqueName: \"kubernetes.io/projected/da54354b-4d85-416a-a642-980289af6893-kube-api-access-6bt5t\") pod \"collect-profiles-29521320-9rh8h\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.484588 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.628325 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc01a77-b188-4af1-8354-0fd361f6df84","Type":"ContainerStarted","Data":"ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729"}
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.629800 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.664420 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.4970209279999995 podStartE2EDuration="9.664401717s" podCreationTimestamp="2026-02-16 21:59:51 +0000 UTC" firstStartedPulling="2026-02-16 21:59:56.784842681 +0000 UTC m=+1317.367343783" lastFinishedPulling="2026-02-16 21:59:59.95222344 +0000 UTC m=+1320.534724572" observedRunningTime="2026-02-16 22:00:00.657069351 +0000 UTC m=+1321.239570453" watchObservedRunningTime="2026-02-16 22:00:00.664401717 +0000 UTC m=+1321.246902819"
Feb 16 22:00:00 crc kubenswrapper[4777]: W0216 22:00:00.995414 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda54354b_4d85_416a_a642_980289af6893.slice/crio-6d234c7e35b292fb462973319fbd11d821b10981088c492d5cf0b5fc0b428c92 WatchSource:0}: Error finding container 6d234c7e35b292fb462973319fbd11d821b10981088c492d5cf0b5fc0b428c92: Status 404 returned error can't find the container with id 6d234c7e35b292fb462973319fbd11d821b10981088c492d5cf0b5fc0b428c92
Feb 16 22:00:00 crc kubenswrapper[4777]: I0216 22:00:00.996631 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"]
Feb 16 22:00:01 crc kubenswrapper[4777]: I0216 22:00:01.642080 4777 generic.go:334] "Generic (PLEG): container finished" podID="da54354b-4d85-416a-a642-980289af6893" containerID="7a4d3282750566a558e0767fe2f040e547c90c54304375242205bde79e25df2a" exitCode=0
Feb 16 22:00:01 crc kubenswrapper[4777]: I0216 22:00:01.642141 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h" event={"ID":"da54354b-4d85-416a-a642-980289af6893","Type":"ContainerDied","Data":"7a4d3282750566a558e0767fe2f040e547c90c54304375242205bde79e25df2a"}
Feb 16 22:00:01 crc kubenswrapper[4777]: I0216 22:00:01.642437 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h" event={"ID":"da54354b-4d85-416a-a642-980289af6893","Type":"ContainerStarted","Data":"6d234c7e35b292fb462973319fbd11d821b10981088c492d5cf0b5fc0b428c92"}
Feb 16 22:00:01 crc kubenswrapper[4777]: I0216 22:00:01.642633 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="ceilometer-central-agent" containerID="cri-o://de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1" gracePeriod=30
Feb 16 22:00:01 crc kubenswrapper[4777]: I0216 22:00:01.642676 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="ceilometer-notification-agent" containerID="cri-o://efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c" gracePeriod=30
Feb 16 22:00:01 crc kubenswrapper[4777]: I0216 22:00:01.642704 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="sg-core" containerID="cri-o://96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7" gracePeriod=30
Feb 16 22:00:01 crc kubenswrapper[4777]: I0216 22:00:01.642674 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="proxy-httpd" containerID="cri-o://ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729" gracePeriod=30
Feb 16 22:00:02 crc kubenswrapper[4777]: I0216 22:00:02.654329 4777 generic.go:334] "Generic (PLEG): container finished" podID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerID="ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729" exitCode=0
Feb 16 22:00:02 crc kubenswrapper[4777]: I0216 22:00:02.654628 4777 generic.go:334] "Generic (PLEG): container finished" podID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerID="96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7" exitCode=2
Feb 16 22:00:02 crc kubenswrapper[4777]: I0216 22:00:02.654637 4777 generic.go:334] "Generic (PLEG): container finished" podID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerID="efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c" exitCode=0
Feb 16 22:00:02 crc kubenswrapper[4777]: I0216 22:00:02.654374 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc01a77-b188-4af1-8354-0fd361f6df84","Type":"ContainerDied","Data":"ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729"}
Feb 16 22:00:02 crc kubenswrapper[4777]: I0216 22:00:02.654696 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc01a77-b188-4af1-8354-0fd361f6df84","Type":"ContainerDied","Data":"96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7"}
Feb 16 22:00:02 crc kubenswrapper[4777]: I0216 22:00:02.654730 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc01a77-b188-4af1-8354-0fd361f6df84","Type":"ContainerDied","Data":"efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c"}
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.108386 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.240161 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bt5t\" (UniqueName: \"kubernetes.io/projected/da54354b-4d85-416a-a642-980289af6893-kube-api-access-6bt5t\") pod \"da54354b-4d85-416a-a642-980289af6893\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") "
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.240408 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da54354b-4d85-416a-a642-980289af6893-secret-volume\") pod \"da54354b-4d85-416a-a642-980289af6893\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") "
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.240487 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da54354b-4d85-416a-a642-980289af6893-config-volume\") pod \"da54354b-4d85-416a-a642-980289af6893\" (UID: \"da54354b-4d85-416a-a642-980289af6893\") "
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.241261 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da54354b-4d85-416a-a642-980289af6893-config-volume" (OuterVolumeSpecName: "config-volume") pod "da54354b-4d85-416a-a642-980289af6893" (UID: "da54354b-4d85-416a-a642-980289af6893"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.246123 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da54354b-4d85-416a-a642-980289af6893-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da54354b-4d85-416a-a642-980289af6893" (UID: "da54354b-4d85-416a-a642-980289af6893"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.251140 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da54354b-4d85-416a-a642-980289af6893-kube-api-access-6bt5t" (OuterVolumeSpecName: "kube-api-access-6bt5t") pod "da54354b-4d85-416a-a642-980289af6893" (UID: "da54354b-4d85-416a-a642-980289af6893"). InnerVolumeSpecName "kube-api-access-6bt5t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.343279 4777 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da54354b-4d85-416a-a642-980289af6893-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.343312 4777 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da54354b-4d85-416a-a642-980289af6893-config-volume\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.343324 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bt5t\" (UniqueName: \"kubernetes.io/projected/da54354b-4d85-416a-a642-980289af6893-kube-api-access-6bt5t\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.666834 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h" event={"ID":"da54354b-4d85-416a-a642-980289af6893","Type":"ContainerDied","Data":"6d234c7e35b292fb462973319fbd11d821b10981088c492d5cf0b5fc0b428c92"}
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.666897 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d234c7e35b292fb462973319fbd11d821b10981088c492d5cf0b5fc0b428c92"
Feb 16 22:00:03 crc kubenswrapper[4777]: I0216 22:00:03.666961 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521320-9rh8h"
Feb 16 22:00:07 crc kubenswrapper[4777]: E0216 22:00:07.184674 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:00:09 crc kubenswrapper[4777]: I0216 22:00:09.742203 4777 generic.go:334] "Generic (PLEG): container finished" podID="2997f9cf-78c3-458d-b5c4-7c774d1c84a8" containerID="b29243746f7ac132751aa74466f197b11898a9e98a2b8b1009dc3e5eecb977a6" exitCode=0
Feb 16 22:00:09 crc kubenswrapper[4777]: I0216 22:00:09.742289 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h7jcx" event={"ID":"2997f9cf-78c3-458d-b5c4-7c774d1c84a8","Type":"ContainerDied","Data":"b29243746f7ac132751aa74466f197b11898a9e98a2b8b1009dc3e5eecb977a6"}
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.344915 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h7jcx"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.473848 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-scripts\") pod \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.473923 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnv59\" (UniqueName: \"kubernetes.io/projected/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-kube-api-access-mnv59\") pod \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.474014 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-combined-ca-bundle\") pod \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.474089 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-config-data\") pod \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\" (UID: \"2997f9cf-78c3-458d-b5c4-7c774d1c84a8\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.481951 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-kube-api-access-mnv59" (OuterVolumeSpecName: "kube-api-access-mnv59") pod "2997f9cf-78c3-458d-b5c4-7c774d1c84a8" (UID: "2997f9cf-78c3-458d-b5c4-7c774d1c84a8"). InnerVolumeSpecName "kube-api-access-mnv59". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.499975 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-scripts" (OuterVolumeSpecName: "scripts") pod "2997f9cf-78c3-458d-b5c4-7c774d1c84a8" (UID: "2997f9cf-78c3-458d-b5c4-7c774d1c84a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.505902 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-config-data" (OuterVolumeSpecName: "config-data") pod "2997f9cf-78c3-458d-b5c4-7c774d1c84a8" (UID: "2997f9cf-78c3-458d-b5c4-7c774d1c84a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.517926 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2997f9cf-78c3-458d-b5c4-7c774d1c84a8" (UID: "2997f9cf-78c3-458d-b5c4-7c774d1c84a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.576181 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-config-data\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.576210 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-scripts\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.576219 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnv59\" (UniqueName: \"kubernetes.io/projected/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-kube-api-access-mnv59\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.576229 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2997f9cf-78c3-458d-b5c4-7c774d1c84a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.608970 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.652763 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.653024 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.676602 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-run-httpd\") pod \"bfc01a77-b188-4af1-8354-0fd361f6df84\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.676634 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-log-httpd\") pod \"bfc01a77-b188-4af1-8354-0fd361f6df84\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.676663 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xrlb\" (UniqueName: \"kubernetes.io/projected/bfc01a77-b188-4af1-8354-0fd361f6df84-kube-api-access-7xrlb\") pod \"bfc01a77-b188-4af1-8354-0fd361f6df84\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.676682 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-ceilometer-tls-certs\") pod \"bfc01a77-b188-4af1-8354-0fd361f6df84\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.676697 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-config-data\") pod \"bfc01a77-b188-4af1-8354-0fd361f6df84\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.676796 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-scripts\") pod \"bfc01a77-b188-4af1-8354-0fd361f6df84\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.676818 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-combined-ca-bundle\") pod \"bfc01a77-b188-4af1-8354-0fd361f6df84\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.676880 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-sg-core-conf-yaml\") pod \"bfc01a77-b188-4af1-8354-0fd361f6df84\" (UID: \"bfc01a77-b188-4af1-8354-0fd361f6df84\") "
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.677426 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bfc01a77-b188-4af1-8354-0fd361f6df84" (UID: "bfc01a77-b188-4af1-8354-0fd361f6df84"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.677465 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bfc01a77-b188-4af1-8354-0fd361f6df84" (UID: "bfc01a77-b188-4af1-8354-0fd361f6df84"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.680899 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfc01a77-b188-4af1-8354-0fd361f6df84-kube-api-access-7xrlb" (OuterVolumeSpecName: "kube-api-access-7xrlb") pod "bfc01a77-b188-4af1-8354-0fd361f6df84" (UID: "bfc01a77-b188-4af1-8354-0fd361f6df84"). InnerVolumeSpecName "kube-api-access-7xrlb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.680949 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-scripts" (OuterVolumeSpecName: "scripts") pod "bfc01a77-b188-4af1-8354-0fd361f6df84" (UID: "bfc01a77-b188-4af1-8354-0fd361f6df84"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.712766 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bfc01a77-b188-4af1-8354-0fd361f6df84" (UID: "bfc01a77-b188-4af1-8354-0fd361f6df84"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.731320 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bfc01a77-b188-4af1-8354-0fd361f6df84" (UID: "bfc01a77-b188-4af1-8354-0fd361f6df84"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.750544 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfc01a77-b188-4af1-8354-0fd361f6df84" (UID: "bfc01a77-b188-4af1-8354-0fd361f6df84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.767211 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h7jcx" event={"ID":"2997f9cf-78c3-458d-b5c4-7c774d1c84a8","Type":"ContainerDied","Data":"baef23591189686a728982bffc0d4d112f9b2718cceacd0f698a6542abd18a1c"}
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.767249 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baef23591189686a728982bffc0d4d112f9b2718cceacd0f698a6542abd18a1c"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.767322 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h7jcx"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.771132 4777 generic.go:334] "Generic (PLEG): container finished" podID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerID="de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1" exitCode=0
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.771179 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc01a77-b188-4af1-8354-0fd361f6df84","Type":"ContainerDied","Data":"de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1"}
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.771213 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfc01a77-b188-4af1-8354-0fd361f6df84","Type":"ContainerDied","Data":"cd608641f762b9b424242ef9eeda44be1a3cea72015e9344aac1434a2b9a1358"}
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.771236 4777 scope.go:117] "RemoveContainer" containerID="ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.771401 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.779516 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.779558 4777 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.779573 4777 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.779587 4777 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfc01a77-b188-4af1-8354-0fd361f6df84-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.779599 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xrlb\" (UniqueName: \"kubernetes.io/projected/bfc01a77-b188-4af1-8354-0fd361f6df84-kube-api-access-7xrlb\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.779612 4777 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.779630 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-scripts\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.788540 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-config-data" (OuterVolumeSpecName: "config-data") pod "bfc01a77-b188-4af1-8354-0fd361f6df84" (UID: "bfc01a77-b188-4af1-8354-0fd361f6df84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.805673 4777 scope.go:117] "RemoveContainer" containerID="96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.828210 4777 scope.go:117] "RemoveContainer" containerID="efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.850981 4777 scope.go:117] "RemoveContainer" containerID="de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.880239 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc01a77-b188-4af1-8354-0fd361f6df84-config-data\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.891849 4777 scope.go:117] "RemoveContainer" containerID="ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892171 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 16 22:00:11 crc kubenswrapper[4777]: E0216 22:00:11.892502 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da54354b-4d85-416a-a642-980289af6893" containerName="collect-profiles"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892517 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="da54354b-4d85-416a-a642-980289af6893" containerName="collect-profiles"
Feb 16 22:00:11 crc kubenswrapper[4777]: E0216 22:00:11.892540 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2997f9cf-78c3-458d-b5c4-7c774d1c84a8" containerName="nova-cell0-conductor-db-sync"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892547 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2997f9cf-78c3-458d-b5c4-7c774d1c84a8" containerName="nova-cell0-conductor-db-sync"
Feb 16 22:00:11 crc kubenswrapper[4777]: E0216 22:00:11.892558 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="proxy-httpd"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892563 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="proxy-httpd"
Feb 16 22:00:11 crc kubenswrapper[4777]: E0216 22:00:11.892573 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="ceilometer-notification-agent"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892578 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="ceilometer-notification-agent"
Feb 16 22:00:11 crc kubenswrapper[4777]: E0216 22:00:11.892588 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="ceilometer-central-agent"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892594 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="ceilometer-central-agent"
Feb 16 22:00:11 crc kubenswrapper[4777]: E0216 22:00:11.892607 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="sg-core"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892617 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="sg-core"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892788 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="2997f9cf-78c3-458d-b5c4-7c774d1c84a8" containerName="nova-cell0-conductor-db-sync"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892802 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="ceilometer-notification-agent"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892814 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="proxy-httpd"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892823 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="da54354b-4d85-416a-a642-980289af6893" containerName="collect-profiles"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892837 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="ceilometer-central-agent"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.892850 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" containerName="sg-core"
Feb 16 22:00:11 crc kubenswrapper[4777]: E0216 22:00:11.897860 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729\": container with ID starting with ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729 not found: ID does not exist" containerID="ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729"
Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.897899 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729"} err="failed to get container status \"ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729\": rpc error: code = 
NotFound desc = could not find container \"ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729\": container with ID starting with ef9706f31bcb6b8fe1faa9ea900f4b63e932f25b21159ef8c2b8526d4da43729 not found: ID does not exist" Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.897925 4777 scope.go:117] "RemoveContainer" containerID="96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7" Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.898542 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:11 crc kubenswrapper[4777]: E0216 22:00:11.898566 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7\": container with ID starting with 96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7 not found: ID does not exist" containerID="96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7" Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.898612 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7"} err="failed to get container status \"96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7\": rpc error: code = NotFound desc = could not find container \"96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7\": container with ID starting with 96d955cdb09fd62e057833c81506470e67caa1bca2f218fe3b0a9fbd15b01ee7 not found: ID does not exist" Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.898640 4777 scope.go:117] "RemoveContainer" containerID="efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c" Feb 16 22:00:11 crc kubenswrapper[4777]: E0216 22:00:11.899191 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c\": container with ID starting with efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c not found: ID does not exist" containerID="efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c" Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.899223 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c"} err="failed to get container status \"efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c\": rpc error: code = NotFound desc = could not find container \"efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c\": container with ID starting with efcf3f2de58ee83b7b9c0afe1330c5cbfe9c5c4229b515b85f95b106f4a0d33c not found: ID does not exist" Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.899258 4777 scope.go:117] "RemoveContainer" containerID="de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1" Feb 16 22:00:11 crc kubenswrapper[4777]: E0216 22:00:11.899611 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1\": container with ID starting with de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1 not found: ID does not exist" containerID="de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1" Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.899655 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1"} err="failed to get container status \"de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1\": rpc error: code = NotFound desc = could not find container 
\"de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1\": container with ID starting with de3f14d07890a57995d471d48e50b81d0ef42718a6722987e7ed8cc0327feba1 not found: ID does not exist" Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.902614 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.902845 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-cjxdb" Feb 16 22:00:11 crc kubenswrapper[4777]: I0216 22:00:11.919981 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.084241 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zljd7\" (UniqueName: \"kubernetes.io/projected/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-kube-api-access-zljd7\") pod \"nova-cell0-conductor-0\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.084303 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.084677 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.105445 4777 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.117167 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.136819 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.142291 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.144275 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.144784 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.146598 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.153018 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.192703 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.192850 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zljd7\" (UniqueName: \"kubernetes.io/projected/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-kube-api-access-zljd7\") pod \"nova-cell0-conductor-0\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 
22:00:12.192935 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.200302 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.200791 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.205574 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfc01a77-b188-4af1-8354-0fd361f6df84" path="/var/lib/kubelet/pods/bfc01a77-b188-4af1-8354-0fd361f6df84/volumes" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.206369 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.216211 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zljd7\" (UniqueName: \"kubernetes.io/projected/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-kube-api-access-zljd7\") pod \"nova-cell0-conductor-0\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:12 crc kubenswrapper[4777]: E0216 22:00:12.227109 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[kube-api-access-zljd7], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell0-conductor-0" podUID="fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.294291 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.294341 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-run-httpd\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.294390 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.294655 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.294747 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-scripts\") pod \"ceilometer-0\" (UID: 
\"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.294781 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jwnt\" (UniqueName: \"kubernetes.io/projected/267fa1e2-1ed3-4c2e-9973-1372be71330a-kube-api-access-7jwnt\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.294822 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-log-httpd\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.295083 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-config-data\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.397082 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-config-data\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.397382 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.397411 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-run-httpd\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.397846 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.397936 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.397966 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-scripts\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.397991 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jwnt\" (UniqueName: \"kubernetes.io/projected/267fa1e2-1ed3-4c2e-9973-1372be71330a-kube-api-access-7jwnt\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.398013 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-log-httpd\") pod \"ceilometer-0\" (UID: 
\"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.398224 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-run-httpd\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.398465 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-log-httpd\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.402078 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-config-data\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.403525 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.403980 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.404990 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-scripts\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.405596 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.428385 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jwnt\" (UniqueName: \"kubernetes.io/projected/267fa1e2-1ed3-4c2e-9973-1372be71330a-kube-api-access-7jwnt\") pod \"ceilometer-0\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.602675 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.804547 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:12 crc kubenswrapper[4777]: I0216 22:00:12.823370 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.014115 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zljd7\" (UniqueName: \"kubernetes.io/projected/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-kube-api-access-zljd7\") pod \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.014200 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-config-data\") pod \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.014248 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-combined-ca-bundle\") pod \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\" (UID: \"fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a\") " Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.019853 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-config-data" (OuterVolumeSpecName: "config-data") pod "fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a" (UID: "fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.019880 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a" (UID: "fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.024386 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-kube-api-access-zljd7" (OuterVolumeSpecName: "kube-api-access-zljd7") pod "fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a" (UID: "fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a"). InnerVolumeSpecName "kube-api-access-zljd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.104351 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.116095 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zljd7\" (UniqueName: \"kubernetes.io/projected/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-kube-api-access-zljd7\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.116138 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.116147 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.821930 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.821944 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"267fa1e2-1ed3-4c2e-9973-1372be71330a","Type":"ContainerStarted","Data":"446922b5ed74b87828debe8344b2f3bde15c74baca6729bc91974d2741c96fac"} Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.902889 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.918757 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.928630 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.931204 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.934038 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.935021 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-cjxdb" Feb 16 22:00:13 crc kubenswrapper[4777]: I0216 22:00:13.939575 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.038643 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b23d431a-b9f1-4151-9189-f027693eaabf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b23d431a-b9f1-4151-9189-f027693eaabf\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.038687 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnkn8\" (UniqueName: \"kubernetes.io/projected/b23d431a-b9f1-4151-9189-f027693eaabf-kube-api-access-lnkn8\") pod \"nova-cell0-conductor-0\" (UID: \"b23d431a-b9f1-4151-9189-f027693eaabf\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.038708 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23d431a-b9f1-4151-9189-f027693eaabf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b23d431a-b9f1-4151-9189-f027693eaabf\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.060640 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.140931 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b23d431a-b9f1-4151-9189-f027693eaabf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b23d431a-b9f1-4151-9189-f027693eaabf\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.140978 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnkn8\" (UniqueName: \"kubernetes.io/projected/b23d431a-b9f1-4151-9189-f027693eaabf-kube-api-access-lnkn8\") pod \"nova-cell0-conductor-0\" (UID: \"b23d431a-b9f1-4151-9189-f027693eaabf\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.141000 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23d431a-b9f1-4151-9189-f027693eaabf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b23d431a-b9f1-4151-9189-f027693eaabf\") " 
pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.147185 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23d431a-b9f1-4151-9189-f027693eaabf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b23d431a-b9f1-4151-9189-f027693eaabf\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.147620 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b23d431a-b9f1-4151-9189-f027693eaabf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b23d431a-b9f1-4151-9189-f027693eaabf\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.170193 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnkn8\" (UniqueName: \"kubernetes.io/projected/b23d431a-b9f1-4151-9189-f027693eaabf-kube-api-access-lnkn8\") pod \"nova-cell0-conductor-0\" (UID: \"b23d431a-b9f1-4151-9189-f027693eaabf\") " pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.192875 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a" path="/var/lib/kubelet/pods/fd0a8fdf-dc9c-4f9b-8fe2-f61baff66f2a/volumes" Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.254429 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:14 crc kubenswrapper[4777]: W0216 22:00:14.726068 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb23d431a_b9f1_4151_9189_f027693eaabf.slice/crio-8d7a774f48d7717beb9d640276f9b106540aedab8563808d0fadc2b1c6168170 WatchSource:0}: Error finding container 8d7a774f48d7717beb9d640276f9b106540aedab8563808d0fadc2b1c6168170: Status 404 returned error can't find the container with id 8d7a774f48d7717beb9d640276f9b106540aedab8563808d0fadc2b1c6168170 Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.728106 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.840372 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"267fa1e2-1ed3-4c2e-9973-1372be71330a","Type":"ContainerStarted","Data":"7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc"} Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.840419 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"267fa1e2-1ed3-4c2e-9973-1372be71330a","Type":"ContainerStarted","Data":"e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443"} Feb 16 22:00:14 crc kubenswrapper[4777]: I0216 22:00:14.841748 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b23d431a-b9f1-4151-9189-f027693eaabf","Type":"ContainerStarted","Data":"8d7a774f48d7717beb9d640276f9b106540aedab8563808d0fadc2b1c6168170"} Feb 16 22:00:15 crc kubenswrapper[4777]: I0216 22:00:15.853236 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"267fa1e2-1ed3-4c2e-9973-1372be71330a","Type":"ContainerStarted","Data":"e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44"} Feb 16 22:00:15 crc 
kubenswrapper[4777]: I0216 22:00:15.856471 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b23d431a-b9f1-4151-9189-f027693eaabf","Type":"ContainerStarted","Data":"31de10185ac73bd468e0710371a5cb47343770506faa6759bdfb05a01dbc8bf5"} Feb 16 22:00:15 crc kubenswrapper[4777]: I0216 22:00:15.856646 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:15 crc kubenswrapper[4777]: I0216 22:00:15.881109 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.881089722 podStartE2EDuration="2.881089722s" podCreationTimestamp="2026-02-16 22:00:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:00:15.872668946 +0000 UTC m=+1336.455170048" watchObservedRunningTime="2026-02-16 22:00:15.881089722 +0000 UTC m=+1336.463590824" Feb 16 22:00:16 crc kubenswrapper[4777]: I0216 22:00:16.893847 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"267fa1e2-1ed3-4c2e-9973-1372be71330a","Type":"ContainerStarted","Data":"7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465"} Feb 16 22:00:16 crc kubenswrapper[4777]: I0216 22:00:16.894093 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="ceilometer-central-agent" containerID="cri-o://e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443" gracePeriod=30 Feb 16 22:00:16 crc kubenswrapper[4777]: I0216 22:00:16.894132 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="sg-core" 
containerID="cri-o://e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44" gracePeriod=30 Feb 16 22:00:16 crc kubenswrapper[4777]: I0216 22:00:16.894149 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="ceilometer-notification-agent" containerID="cri-o://7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc" gracePeriod=30 Feb 16 22:00:16 crc kubenswrapper[4777]: I0216 22:00:16.894365 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 16 22:00:16 crc kubenswrapper[4777]: I0216 22:00:16.894143 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="proxy-httpd" containerID="cri-o://7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465" gracePeriod=30 Feb 16 22:00:16 crc kubenswrapper[4777]: I0216 22:00:16.931096 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.525707924 podStartE2EDuration="4.931066788s" podCreationTimestamp="2026-02-16 22:00:12 +0000 UTC" firstStartedPulling="2026-02-16 22:00:13.113805258 +0000 UTC m=+1333.696306360" lastFinishedPulling="2026-02-16 22:00:16.519164102 +0000 UTC m=+1337.101665224" observedRunningTime="2026-02-16 22:00:16.923416464 +0000 UTC m=+1337.505917596" watchObservedRunningTime="2026-02-16 22:00:16.931066788 +0000 UTC m=+1337.513567920" Feb 16 22:00:17 crc kubenswrapper[4777]: I0216 22:00:17.912628 4777 generic.go:334] "Generic (PLEG): container finished" podID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerID="e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44" exitCode=2 Feb 16 22:00:17 crc kubenswrapper[4777]: I0216 22:00:17.913101 4777 generic.go:334] "Generic (PLEG): container finished" podID="267fa1e2-1ed3-4c2e-9973-1372be71330a" 
containerID="7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc" exitCode=0 Feb 16 22:00:17 crc kubenswrapper[4777]: I0216 22:00:17.912733 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"267fa1e2-1ed3-4c2e-9973-1372be71330a","Type":"ContainerDied","Data":"e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44"} Feb 16 22:00:17 crc kubenswrapper[4777]: I0216 22:00:17.913154 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"267fa1e2-1ed3-4c2e-9973-1372be71330a","Type":"ContainerDied","Data":"7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc"} Feb 16 22:00:19 crc kubenswrapper[4777]: I0216 22:00:19.298856 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 16 22:00:19 crc kubenswrapper[4777]: I0216 22:00:19.856037 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-m5pdh"] Feb 16 22:00:19 crc kubenswrapper[4777]: I0216 22:00:19.857727 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:19 crc kubenswrapper[4777]: I0216 22:00:19.859898 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 16 22:00:19 crc kubenswrapper[4777]: I0216 22:00:19.860208 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 16 22:00:19 crc kubenswrapper[4777]: I0216 22:00:19.873265 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m5pdh"] Feb 16 22:00:19 crc kubenswrapper[4777]: I0216 22:00:19.978539 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-config-data\") pod \"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:19 crc kubenswrapper[4777]: I0216 22:00:19.978966 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:19 crc kubenswrapper[4777]: I0216 22:00:19.979003 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5szh9\" (UniqueName: \"kubernetes.io/projected/2cd16d27-1038-4aba-89fa-c789dfb631af-kube-api-access-5szh9\") pod \"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:19 crc kubenswrapper[4777]: I0216 22:00:19.979212 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-scripts\") pod \"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.009762 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.011436 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.018230 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.024399 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.083022 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-config-data\") pod \"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.083100 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.083129 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5szh9\" (UniqueName: \"kubernetes.io/projected/2cd16d27-1038-4aba-89fa-c789dfb631af-kube-api-access-5szh9\") pod \"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " 
pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.083206 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-scripts\") pod \"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.094426 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-scripts\") pod \"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.108454 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-config-data\") pod \"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.120144 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.146510 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.161269 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.169093 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.193077 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7547n\" (UniqueName: \"kubernetes.io/projected/f188aa13-f485-4572-a808-fac0f4e134a2-kube-api-access-7547n\") pod \"nova-api-0\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.193210 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.193665 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f188aa13-f485-4572-a808-fac0f4e134a2-logs\") pod \"nova-api-0\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.198842 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-config-data\") pod \"nova-api-0\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.242756 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5szh9\" (UniqueName: \"kubernetes.io/projected/2cd16d27-1038-4aba-89fa-c789dfb631af-kube-api-access-5szh9\") pod 
\"nova-cell0-cell-mapping-m5pdh\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:20 crc kubenswrapper[4777]: E0216 22:00:20.248549 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.306119 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-config-data\") pod \"nova-scheduler-0\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.313472 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7547n\" (UniqueName: \"kubernetes.io/projected/f188aa13-f485-4572-a808-fac0f4e134a2-kube-api-access-7547n\") pod \"nova-api-0\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.313574 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc55k\" (UniqueName: \"kubernetes.io/projected/b18de186-7ada-4893-900b-974aac51a495-kube-api-access-vc55k\") pod \"nova-scheduler-0\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.313695 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.313892 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.313968 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f188aa13-f485-4572-a808-fac0f4e134a2-logs\") pod \"nova-api-0\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.314071 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-config-data\") pod \"nova-api-0\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.314221 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.314266 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.315637 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.319110 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.325087 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.330042 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f188aa13-f485-4572-a808-fac0f4e134a2-logs\") pod \"nova-api-0\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.332981 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.335222 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-config-data\") pod \"nova-api-0\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.341282 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7547n\" (UniqueName: \"kubernetes.io/projected/f188aa13-f485-4572-a808-fac0f4e134a2-kube-api-access-7547n\") pod \"nova-api-0\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.341590 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.342120 4777 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.344594 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.360701 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.360958 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.406645 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-llbkr"] Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.416827 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") " pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.416889 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.417021 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-config-data\") pod \"nova-scheduler-0\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.417065 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ncw55\" (UniqueName: \"kubernetes.io/projected/b43568b7-224c-4ace-b602-c0c7008ab829-kube-api-access-ncw55\") pod \"nova-cell1-novncproxy-0\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") " pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.417189 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") " pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.417208 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc55k\" (UniqueName: \"kubernetes.io/projected/b18de186-7ada-4893-900b-974aac51a495-kube-api-access-vc55k\") pod \"nova-scheduler-0\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.426641 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-config-data\") pod \"nova-scheduler-0\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.431259 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.433882 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-llbkr"] Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.434089 4777 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.438862 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc55k\" (UniqueName: \"kubernetes.io/projected/b18de186-7ada-4893-900b-974aac51a495-kube-api-access-vc55k\") pod \"nova-scheduler-0\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.493221 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.519179 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8fl2\" (UniqueName: \"kubernetes.io/projected/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-kube-api-access-d8fl2\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.519230 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") " pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.519288 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") " pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.519317 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c80239e6-1745-49a4-87ba-f7f835bab0fc-logs\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.519361 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdwrk\" (UniqueName: \"kubernetes.io/projected/c80239e6-1745-49a4-87ba-f7f835bab0fc-kube-api-access-zdwrk\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.519398 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.519415 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.519538 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.521868 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.521977 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.522013 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-config\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.522057 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncw55\" (UniqueName: \"kubernetes.io/projected/b43568b7-224c-4ace-b602-c0c7008ab829-kube-api-access-ncw55\") pod \"nova-cell1-novncproxy-0\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") " pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.522115 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-config-data\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.530672 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") " pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.530747 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") " pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.538595 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncw55\" (UniqueName: \"kubernetes.io/projected/b43568b7-224c-4ace-b602-c0c7008ab829-kube-api-access-ncw55\") pod \"nova-cell1-novncproxy-0\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") " pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.618358 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.623404 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c80239e6-1745-49a4-87ba-f7f835bab0fc-logs\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.623458 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdwrk\" (UniqueName: \"kubernetes.io/projected/c80239e6-1745-49a4-87ba-f7f835bab0fc-kube-api-access-zdwrk\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.623498 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.623517 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.624275 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c80239e6-1745-49a4-87ba-f7f835bab0fc-logs\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.625165 4777 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.625571 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.625581 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.623543 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.625674 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.626038 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.627009 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-config\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.627852 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-config-data\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.627984 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8fl2\" (UniqueName: \"kubernetes.io/projected/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-kube-api-access-d8fl2\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.629141 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.632531 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-config\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: 
\"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.632947 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.635592 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-config-data\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.640244 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdwrk\" (UniqueName: \"kubernetes.io/projected/c80239e6-1745-49a4-87ba-f7f835bab0fc-kube-api-access-zdwrk\") pod \"nova-metadata-0\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") " pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.651739 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8fl2\" (UniqueName: \"kubernetes.io/projected/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-kube-api-access-d8fl2\") pod \"dnsmasq-dns-757b4f8459-llbkr\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") " pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.751279 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.760865 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.784315 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.884114 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:00:20 crc kubenswrapper[4777]: I0216 22:00:20.956841 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f188aa13-f485-4572-a808-fac0f4e134a2","Type":"ContainerStarted","Data":"1ebc8ab7d534768f57be6f4890692cce60fbf206cd887bb35d2953e4ceb1ce85"} Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.017111 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2gskq"] Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.019070 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.021013 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.021494 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.031705 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2gskq"] Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.076874 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m5pdh"] Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.146637 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-scripts\") pod 
\"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.146685 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qcts\" (UniqueName: \"kubernetes.io/projected/b6f6de59-67d3-49c4-983f-5352c6178e5c-kube-api-access-4qcts\") pod \"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.146857 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-config-data\") pod \"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.146885 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.223430 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.252133 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-config-data\") pod \"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: 
I0216 22:00:21.252172 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.252277 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-scripts\") pod \"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.252310 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qcts\" (UniqueName: \"kubernetes.io/projected/b6f6de59-67d3-49c4-983f-5352c6178e5c-kube-api-access-4qcts\") pod \"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.256892 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-scripts\") pod \"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.256896 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 
22:00:21.261447 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-config-data\") pod \"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.269680 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qcts\" (UniqueName: \"kubernetes.io/projected/b6f6de59-67d3-49c4-983f-5352c6178e5c-kube-api-access-4qcts\") pod \"nova-cell1-conductor-db-sync-2gskq\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.347395 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.463789 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 16 22:00:21 crc kubenswrapper[4777]: W0216 22:00:21.476038 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb43568b7_224c_4ace_b602_c0c7008ab829.slice/crio-6f035504184ad6a4a51838ca46d70183cd18ec25fe94b4924034378211835c16 WatchSource:0}: Error finding container 6f035504184ad6a4a51838ca46d70183cd18ec25fe94b4924034378211835c16: Status 404 returned error can't find the container with id 6f035504184ad6a4a51838ca46d70183cd18ec25fe94b4924034378211835c16 Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.583761 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.599877 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-llbkr"] Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 
22:00:21.819330 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2gskq"] Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.974532 4777 generic.go:334] "Generic (PLEG): container finished" podID="37414fb0-84e5-436f-9e5e-d6bdaba54c8c" containerID="37e387ab7dc796fb6fa890ebd00b063e8bde9687a8067fe68027e30d7c3fe612" exitCode=0 Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.975548 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-llbkr" event={"ID":"37414fb0-84e5-436f-9e5e-d6bdaba54c8c","Type":"ContainerDied","Data":"37e387ab7dc796fb6fa890ebd00b063e8bde9687a8067fe68027e30d7c3fe612"} Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.975582 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-llbkr" event={"ID":"37414fb0-84e5-436f-9e5e-d6bdaba54c8c","Type":"ContainerStarted","Data":"75cda06423bb8b5482f1ad1fb2fc57cbf606d90b07e0b014f990fd0fb2f98995"} Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.982097 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m5pdh" event={"ID":"2cd16d27-1038-4aba-89fa-c789dfb631af","Type":"ContainerStarted","Data":"199065924a96a5dec9434b19498eab9aab5d5463dee5bb8751d6e560bd611b3b"} Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.982130 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m5pdh" event={"ID":"2cd16d27-1038-4aba-89fa-c789dfb631af","Type":"ContainerStarted","Data":"ee70a90230d9fa3db7e6c6a4c06cde5d89d35546b2f484b76e61f4052a8a377f"} Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.984796 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c80239e6-1745-49a4-87ba-f7f835bab0fc","Type":"ContainerStarted","Data":"0e8f5905c9fe0cca8e3b62f0354b0869ee43e6eb18eb54de3e1b16afc9496c9e"} Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 
22:00:21.993162 4777 generic.go:334] "Generic (PLEG): container finished" podID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerID="e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443" exitCode=0 Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.993228 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"267fa1e2-1ed3-4c2e-9973-1372be71330a","Type":"ContainerDied","Data":"e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443"} Feb 16 22:00:21 crc kubenswrapper[4777]: I0216 22:00:21.995094 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2gskq" event={"ID":"b6f6de59-67d3-49c4-983f-5352c6178e5c","Type":"ContainerStarted","Data":"6ef55c18fac68cfefa2e61a6c4620ee906d1141d635cfd0892200a547ce241a3"} Feb 16 22:00:22 crc kubenswrapper[4777]: I0216 22:00:22.001212 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b18de186-7ada-4893-900b-974aac51a495","Type":"ContainerStarted","Data":"759832d7b93c75f12f7859ec3ec3bd780fd2dfb3c7e6d0d4fda1435f10942fc3"} Feb 16 22:00:22 crc kubenswrapper[4777]: I0216 22:00:22.004070 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b43568b7-224c-4ace-b602-c0c7008ab829","Type":"ContainerStarted","Data":"6f035504184ad6a4a51838ca46d70183cd18ec25fe94b4924034378211835c16"} Feb 16 22:00:22 crc kubenswrapper[4777]: I0216 22:00:22.023266 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-m5pdh" podStartSLOduration=3.023247106 podStartE2EDuration="3.023247106s" podCreationTimestamp="2026-02-16 22:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:00:22.015845448 +0000 UTC m=+1342.598346550" watchObservedRunningTime="2026-02-16 22:00:22.023247106 +0000 UTC 
m=+1342.605748208" Feb 16 22:00:23 crc kubenswrapper[4777]: I0216 22:00:23.018396 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-llbkr" event={"ID":"37414fb0-84e5-436f-9e5e-d6bdaba54c8c","Type":"ContainerStarted","Data":"80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7"} Feb 16 22:00:23 crc kubenswrapper[4777]: I0216 22:00:23.018550 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:23 crc kubenswrapper[4777]: I0216 22:00:23.023664 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2gskq" event={"ID":"b6f6de59-67d3-49c4-983f-5352c6178e5c","Type":"ContainerStarted","Data":"b23ad5eb9ad435dd251c02ab2d2791e4c6c439edb4a3737c0f08f4ab7cb7beaa"} Feb 16 22:00:23 crc kubenswrapper[4777]: I0216 22:00:23.046906 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-llbkr" podStartSLOduration=3.046886835 podStartE2EDuration="3.046886835s" podCreationTimestamp="2026-02-16 22:00:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:00:23.03599232 +0000 UTC m=+1343.618493422" watchObservedRunningTime="2026-02-16 22:00:23.046886835 +0000 UTC m=+1343.629387947" Feb 16 22:00:23 crc kubenswrapper[4777]: I0216 22:00:23.076541 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2gskq" podStartSLOduration=3.076523665 podStartE2EDuration="3.076523665s" podCreationTimestamp="2026-02-16 22:00:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:00:23.047735119 +0000 UTC m=+1343.630236231" watchObservedRunningTime="2026-02-16 22:00:23.076523665 +0000 UTC m=+1343.659024767" Feb 16 22:00:23 crc 
kubenswrapper[4777]: I0216 22:00:23.689855 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 16 22:00:23 crc kubenswrapper[4777]: I0216 22:00:23.699596 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.051890 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b18de186-7ada-4893-900b-974aac51a495","Type":"ContainerStarted","Data":"ff397ae9c0033cfd5b74d38878b395a010ab9ba2761fe616dd2bebc03427ffad"} Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.056246 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b43568b7-224c-4ace-b602-c0c7008ab829","Type":"ContainerStarted","Data":"7f86f61bf328e1461a3a3cb5ab33db2178c5b3b20efc8fd3bf80d8b356ba838a"} Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.056366 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b43568b7-224c-4ace-b602-c0c7008ab829" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7f86f61bf328e1461a3a3cb5ab33db2178c5b3b20efc8fd3bf80d8b356ba838a" gracePeriod=30 Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.063610 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c80239e6-1745-49a4-87ba-f7f835bab0fc","Type":"ContainerStarted","Data":"5bf312845e6a092d578609bd2019e50ffaff95942213bc94877e84e270983af5"} Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.063651 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c80239e6-1745-49a4-87ba-f7f835bab0fc","Type":"ContainerStarted","Data":"5c714ae03515451f58c19f4f755969d63cee1ae5ea290e98e98e2abeaaba0119"} Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.063782 4777 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/nova-metadata-0" podUID="c80239e6-1745-49a4-87ba-f7f835bab0fc" containerName="nova-metadata-log" containerID="cri-o://5c714ae03515451f58c19f4f755969d63cee1ae5ea290e98e98e2abeaaba0119" gracePeriod=30 Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.063868 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c80239e6-1745-49a4-87ba-f7f835bab0fc" containerName="nova-metadata-metadata" containerID="cri-o://5bf312845e6a092d578609bd2019e50ffaff95942213bc94877e84e270983af5" gracePeriod=30 Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.067938 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f188aa13-f485-4572-a808-fac0f4e134a2","Type":"ContainerStarted","Data":"72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d"} Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.067981 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f188aa13-f485-4572-a808-fac0f4e134a2","Type":"ContainerStarted","Data":"e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b"} Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.081219 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.985708395 podStartE2EDuration="5.08120257s" podCreationTimestamp="2026-02-16 22:00:20 +0000 UTC" firstStartedPulling="2026-02-16 22:00:21.245598246 +0000 UTC m=+1341.828099348" lastFinishedPulling="2026-02-16 22:00:24.341092411 +0000 UTC m=+1344.923593523" observedRunningTime="2026-02-16 22:00:25.072297651 +0000 UTC m=+1345.654798753" watchObservedRunningTime="2026-02-16 22:00:25.08120257 +0000 UTC m=+1345.663703672" Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.101925 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
podStartSLOduration=2.242036144 podStartE2EDuration="5.10190223s" podCreationTimestamp="2026-02-16 22:00:20 +0000 UTC" firstStartedPulling="2026-02-16 22:00:21.479157047 +0000 UTC m=+1342.061658149" lastFinishedPulling="2026-02-16 22:00:24.339023133 +0000 UTC m=+1344.921524235" observedRunningTime="2026-02-16 22:00:25.090325846 +0000 UTC m=+1345.672826958" watchObservedRunningTime="2026-02-16 22:00:25.10190223 +0000 UTC m=+1345.684403332" Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.116007 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.378660829 podStartE2EDuration="5.115983574s" podCreationTimestamp="2026-02-16 22:00:20 +0000 UTC" firstStartedPulling="2026-02-16 22:00:21.602275895 +0000 UTC m=+1342.184776997" lastFinishedPulling="2026-02-16 22:00:24.33959864 +0000 UTC m=+1344.922099742" observedRunningTime="2026-02-16 22:00:25.114107202 +0000 UTC m=+1345.696608314" watchObservedRunningTime="2026-02-16 22:00:25.115983574 +0000 UTC m=+1345.698484676" Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.141829 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.740586828 podStartE2EDuration="6.141811467s" podCreationTimestamp="2026-02-16 22:00:19 +0000 UTC" firstStartedPulling="2026-02-16 22:00:20.938375121 +0000 UTC m=+1341.520876223" lastFinishedPulling="2026-02-16 22:00:24.33959976 +0000 UTC m=+1344.922100862" observedRunningTime="2026-02-16 22:00:25.131044016 +0000 UTC m=+1345.713545118" watchObservedRunningTime="2026-02-16 22:00:25.141811467 +0000 UTC m=+1345.724312569" Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.619964 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.752860 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" 
Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.761172 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 16 22:00:25 crc kubenswrapper[4777]: I0216 22:00:25.761306 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 16 22:00:26 crc kubenswrapper[4777]: I0216 22:00:26.085770 4777 generic.go:334] "Generic (PLEG): container finished" podID="c80239e6-1745-49a4-87ba-f7f835bab0fc" containerID="5c714ae03515451f58c19f4f755969d63cee1ae5ea290e98e98e2abeaaba0119" exitCode=143 Feb 16 22:00:26 crc kubenswrapper[4777]: I0216 22:00:26.085858 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c80239e6-1745-49a4-87ba-f7f835bab0fc","Type":"ContainerDied","Data":"5c714ae03515451f58c19f4f755969d63cee1ae5ea290e98e98e2abeaaba0119"} Feb 16 22:00:29 crc kubenswrapper[4777]: I0216 22:00:29.133153 4777 generic.go:334] "Generic (PLEG): container finished" podID="b6f6de59-67d3-49c4-983f-5352c6178e5c" containerID="b23ad5eb9ad435dd251c02ab2d2791e4c6c439edb4a3737c0f08f4ab7cb7beaa" exitCode=0 Feb 16 22:00:29 crc kubenswrapper[4777]: I0216 22:00:29.133250 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2gskq" event={"ID":"b6f6de59-67d3-49c4-983f-5352c6178e5c","Type":"ContainerDied","Data":"b23ad5eb9ad435dd251c02ab2d2791e4c6c439edb4a3737c0f08f4ab7cb7beaa"} Feb 16 22:00:29 crc kubenswrapper[4777]: I0216 22:00:29.137950 4777 generic.go:334] "Generic (PLEG): container finished" podID="2cd16d27-1038-4aba-89fa-c789dfb631af" containerID="199065924a96a5dec9434b19498eab9aab5d5463dee5bb8751d6e560bd611b3b" exitCode=0 Feb 16 22:00:29 crc kubenswrapper[4777]: I0216 22:00:29.138017 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m5pdh" 
event={"ID":"2cd16d27-1038-4aba-89fa-c789dfb631af","Type":"ContainerDied","Data":"199065924a96a5dec9434b19498eab9aab5d5463dee5bb8751d6e560bd611b3b"} Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.343270 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.343568 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.624415 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.663816 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.785867 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-llbkr" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.791198 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.797591 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.886602 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-m28rz"] Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.886971 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" podUID="d135dd81-22b2-4c1d-9d7b-bc07f225abbb" containerName="dnsmasq-dns" containerID="cri-o://6435afb5bc5b7a6643a652deb5a732e92ac2c0d0a5b48c9e0ab848d9688a7813" gracePeriod=10 Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.920301 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qcts\" (UniqueName: \"kubernetes.io/projected/b6f6de59-67d3-49c4-983f-5352c6178e5c-kube-api-access-4qcts\") pod \"b6f6de59-67d3-49c4-983f-5352c6178e5c\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.920349 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-combined-ca-bundle\") pod \"b6f6de59-67d3-49c4-983f-5352c6178e5c\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.920443 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-config-data\") pod \"2cd16d27-1038-4aba-89fa-c789dfb631af\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.920477 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-combined-ca-bundle\") pod \"2cd16d27-1038-4aba-89fa-c789dfb631af\" (UID: 
\"2cd16d27-1038-4aba-89fa-c789dfb631af\") " Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.920519 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-config-data\") pod \"b6f6de59-67d3-49c4-983f-5352c6178e5c\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.920570 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-scripts\") pod \"2cd16d27-1038-4aba-89fa-c789dfb631af\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.920657 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-scripts\") pod \"b6f6de59-67d3-49c4-983f-5352c6178e5c\" (UID: \"b6f6de59-67d3-49c4-983f-5352c6178e5c\") " Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.920705 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5szh9\" (UniqueName: \"kubernetes.io/projected/2cd16d27-1038-4aba-89fa-c789dfb631af-kube-api-access-5szh9\") pod \"2cd16d27-1038-4aba-89fa-c789dfb631af\" (UID: \"2cd16d27-1038-4aba-89fa-c789dfb631af\") " Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.938243 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-scripts" (OuterVolumeSpecName: "scripts") pod "2cd16d27-1038-4aba-89fa-c789dfb631af" (UID: "2cd16d27-1038-4aba-89fa-c789dfb631af"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.938334 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f6de59-67d3-49c4-983f-5352c6178e5c-kube-api-access-4qcts" (OuterVolumeSpecName: "kube-api-access-4qcts") pod "b6f6de59-67d3-49c4-983f-5352c6178e5c" (UID: "b6f6de59-67d3-49c4-983f-5352c6178e5c"). InnerVolumeSpecName "kube-api-access-4qcts". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.956246 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-scripts" (OuterVolumeSpecName: "scripts") pod "b6f6de59-67d3-49c4-983f-5352c6178e5c" (UID: "b6f6de59-67d3-49c4-983f-5352c6178e5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.957403 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd16d27-1038-4aba-89fa-c789dfb631af-kube-api-access-5szh9" (OuterVolumeSpecName: "kube-api-access-5szh9") pod "2cd16d27-1038-4aba-89fa-c789dfb631af" (UID: "2cd16d27-1038-4aba-89fa-c789dfb631af"). InnerVolumeSpecName "kube-api-access-5szh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.966999 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6f6de59-67d3-49c4-983f-5352c6178e5c" (UID: "b6f6de59-67d3-49c4-983f-5352c6178e5c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.987169 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-config-data" (OuterVolumeSpecName: "config-data") pod "b6f6de59-67d3-49c4-983f-5352c6178e5c" (UID: "b6f6de59-67d3-49c4-983f-5352c6178e5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.996400 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-config-data" (OuterVolumeSpecName: "config-data") pod "2cd16d27-1038-4aba-89fa-c789dfb631af" (UID: "2cd16d27-1038-4aba-89fa-c789dfb631af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:30 crc kubenswrapper[4777]: I0216 22:00:30.996455 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cd16d27-1038-4aba-89fa-c789dfb631af" (UID: "2cd16d27-1038-4aba-89fa-c789dfb631af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.023774 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.023808 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5szh9\" (UniqueName: \"kubernetes.io/projected/2cd16d27-1038-4aba-89fa-c789dfb631af-kube-api-access-5szh9\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.023823 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qcts\" (UniqueName: \"kubernetes.io/projected/b6f6de59-67d3-49c4-983f-5352c6178e5c-kube-api-access-4qcts\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.023835 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.023848 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.023862 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.023874 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f6de59-67d3-49c4-983f-5352c6178e5c-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.023885 4777 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd16d27-1038-4aba-89fa-c789dfb631af-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.206705 4777 generic.go:334] "Generic (PLEG): container finished" podID="d135dd81-22b2-4c1d-9d7b-bc07f225abbb" containerID="6435afb5bc5b7a6643a652deb5a732e92ac2c0d0a5b48c9e0ab848d9688a7813" exitCode=0 Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.206799 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" event={"ID":"d135dd81-22b2-4c1d-9d7b-bc07f225abbb","Type":"ContainerDied","Data":"6435afb5bc5b7a6643a652deb5a732e92ac2c0d0a5b48c9e0ab848d9688a7813"} Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.209021 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m5pdh" event={"ID":"2cd16d27-1038-4aba-89fa-c789dfb631af","Type":"ContainerDied","Data":"ee70a90230d9fa3db7e6c6a4c06cde5d89d35546b2f484b76e61f4052a8a377f"} Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.209050 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee70a90230d9fa3db7e6c6a4c06cde5d89d35546b2f484b76e61f4052a8a377f" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.209255 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m5pdh" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.221603 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2gskq" event={"ID":"b6f6de59-67d3-49c4-983f-5352c6178e5c","Type":"ContainerDied","Data":"6ef55c18fac68cfefa2e61a6c4620ee906d1141d635cfd0892200a547ce241a3"} Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.221669 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ef55c18fac68cfefa2e61a6c4620ee906d1141d635cfd0892200a547ce241a3" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.228558 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2gskq" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.276900 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.280178 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 16 22:00:31 crc kubenswrapper[4777]: E0216 22:00:31.280560 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f6de59-67d3-49c4-983f-5352c6178e5c" containerName="nova-cell1-conductor-db-sync" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.280579 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f6de59-67d3-49c4-983f-5352c6178e5c" containerName="nova-cell1-conductor-db-sync" Feb 16 22:00:31 crc kubenswrapper[4777]: E0216 22:00:31.280602 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd16d27-1038-4aba-89fa-c789dfb631af" containerName="nova-manage" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.280608 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd16d27-1038-4aba-89fa-c789dfb631af" containerName="nova-manage" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 
22:00:31.280815 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd16d27-1038-4aba-89fa-c789dfb631af" containerName="nova-manage" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.280829 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f6de59-67d3-49c4-983f-5352c6178e5c" containerName="nova-cell1-conductor-db-sync" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.281531 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.283254 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.303309 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.357671 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.359796 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f188aa13-f485-4572-a808-fac0f4e134a2" containerName="nova-api-log" containerID="cri-o://e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b" gracePeriod=30 Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.360344 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f188aa13-f485-4572-a808-fac0f4e134a2" containerName="nova-api-api" containerID="cri-o://72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d" gracePeriod=30 Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.365865 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f188aa13-f485-4572-a808-fac0f4e134a2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.365958 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f188aa13-f485-4572-a808-fac0f4e134a2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.386133 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.441321 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sdhh\" (UniqueName: \"kubernetes.io/projected/bdf361b5-ee5d-4273-a1b8-4efe5941b4a8-kube-api-access-6sdhh\") pod \"nova-cell1-conductor-0\" (UID: \"bdf361b5-ee5d-4273-a1b8-4efe5941b4a8\") " pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.441372 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf361b5-ee5d-4273-a1b8-4efe5941b4a8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bdf361b5-ee5d-4273-a1b8-4efe5941b4a8\") " pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.441533 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf361b5-ee5d-4273-a1b8-4efe5941b4a8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bdf361b5-ee5d-4273-a1b8-4efe5941b4a8\") " pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.542493 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-svc\") pod \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.542586 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27wwn\" (UniqueName: \"kubernetes.io/projected/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-kube-api-access-27wwn\") pod \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.542623 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-swift-storage-0\") pod \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.542656 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-config\") pod \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.542689 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-nb\") pod \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.542860 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-sb\") pod \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\" (UID: \"d135dd81-22b2-4c1d-9d7b-bc07f225abbb\") " Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 
22:00:31.543213 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf361b5-ee5d-4273-a1b8-4efe5941b4a8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bdf361b5-ee5d-4273-a1b8-4efe5941b4a8\") " pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.543362 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sdhh\" (UniqueName: \"kubernetes.io/projected/bdf361b5-ee5d-4273-a1b8-4efe5941b4a8-kube-api-access-6sdhh\") pod \"nova-cell1-conductor-0\" (UID: \"bdf361b5-ee5d-4273-a1b8-4efe5941b4a8\") " pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.543390 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf361b5-ee5d-4273-a1b8-4efe5941b4a8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bdf361b5-ee5d-4273-a1b8-4efe5941b4a8\") " pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.547260 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf361b5-ee5d-4273-a1b8-4efe5941b4a8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bdf361b5-ee5d-4273-a1b8-4efe5941b4a8\") " pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.550208 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-kube-api-access-27wwn" (OuterVolumeSpecName: "kube-api-access-27wwn") pod "d135dd81-22b2-4c1d-9d7b-bc07f225abbb" (UID: "d135dd81-22b2-4c1d-9d7b-bc07f225abbb"). InnerVolumeSpecName "kube-api-access-27wwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.562435 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sdhh\" (UniqueName: \"kubernetes.io/projected/bdf361b5-ee5d-4273-a1b8-4efe5941b4a8-kube-api-access-6sdhh\") pod \"nova-cell1-conductor-0\" (UID: \"bdf361b5-ee5d-4273-a1b8-4efe5941b4a8\") " pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.564001 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf361b5-ee5d-4273-a1b8-4efe5941b4a8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bdf361b5-ee5d-4273-a1b8-4efe5941b4a8\") " pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.596809 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.605293 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d135dd81-22b2-4c1d-9d7b-bc07f225abbb" (UID: "d135dd81-22b2-4c1d-9d7b-bc07f225abbb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.608150 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d135dd81-22b2-4c1d-9d7b-bc07f225abbb" (UID: "d135dd81-22b2-4c1d-9d7b-bc07f225abbb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.624673 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d135dd81-22b2-4c1d-9d7b-bc07f225abbb" (UID: "d135dd81-22b2-4c1d-9d7b-bc07f225abbb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.632362 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d135dd81-22b2-4c1d-9d7b-bc07f225abbb" (UID: "d135dd81-22b2-4c1d-9d7b-bc07f225abbb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.638242 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-config" (OuterVolumeSpecName: "config") pod "d135dd81-22b2-4c1d-9d7b-bc07f225abbb" (UID: "d135dd81-22b2-4c1d-9d7b-bc07f225abbb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.645389 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-config\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.645419 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.645430 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.645438 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.645447 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27wwn\" (UniqueName: \"kubernetes.io/projected/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-kube-api-access-27wwn\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.645457 4777 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d135dd81-22b2-4c1d-9d7b-bc07f225abbb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:31 crc kubenswrapper[4777]: I0216 22:00:31.833700 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:00:32 crc kubenswrapper[4777]: I0216 22:00:32.081794 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 16 22:00:32 crc 
kubenswrapper[4777]: I0216 22:00:32.241892 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bdf361b5-ee5d-4273-a1b8-4efe5941b4a8","Type":"ContainerStarted","Data":"f7b6c1020bdc15227e3d7e8c925ca2658bea92b274f3e4316ac6f43fb8eaae00"} Feb 16 22:00:32 crc kubenswrapper[4777]: I0216 22:00:32.245304 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" event={"ID":"d135dd81-22b2-4c1d-9d7b-bc07f225abbb","Type":"ContainerDied","Data":"046eb717af186b7dfa3e186ef806d6f07ad5563fb2e1f29a5d5cf3797e3cac96"} Feb 16 22:00:32 crc kubenswrapper[4777]: I0216 22:00:32.245363 4777 scope.go:117] "RemoveContainer" containerID="6435afb5bc5b7a6643a652deb5a732e92ac2c0d0a5b48c9e0ab848d9688a7813" Feb 16 22:00:32 crc kubenswrapper[4777]: I0216 22:00:32.245519 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-m28rz" Feb 16 22:00:32 crc kubenswrapper[4777]: I0216 22:00:32.251571 4777 generic.go:334] "Generic (PLEG): container finished" podID="f188aa13-f485-4572-a808-fac0f4e134a2" containerID="e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b" exitCode=143 Feb 16 22:00:32 crc kubenswrapper[4777]: I0216 22:00:32.252085 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f188aa13-f485-4572-a808-fac0f4e134a2","Type":"ContainerDied","Data":"e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b"} Feb 16 22:00:32 crc kubenswrapper[4777]: I0216 22:00:32.275569 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-m28rz"] Feb 16 22:00:32 crc kubenswrapper[4777]: I0216 22:00:32.282884 4777 scope.go:117] "RemoveContainer" containerID="e6cbb7362d54d9d994bf5f123cffa5f8f931039cfea13a48c3a848a8cf0c7aec" Feb 16 22:00:32 crc kubenswrapper[4777]: I0216 22:00:32.284648 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5c9776ccc5-m28rz"] Feb 16 22:00:33 crc kubenswrapper[4777]: E0216 22:00:33.183884 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:00:33 crc kubenswrapper[4777]: I0216 22:00:33.264764 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bdf361b5-ee5d-4273-a1b8-4efe5941b4a8","Type":"ContainerStarted","Data":"0be671ecd6601ad02f37ba795be5117c5fa521ec2b4aebace092c486efe4c7ae"} Feb 16 22:00:33 crc kubenswrapper[4777]: I0216 22:00:33.264785 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b18de186-7ada-4893-900b-974aac51a495" containerName="nova-scheduler-scheduler" containerID="cri-o://ff397ae9c0033cfd5b74d38878b395a010ab9ba2761fe616dd2bebc03427ffad" gracePeriod=30 Feb 16 22:00:33 crc kubenswrapper[4777]: I0216 22:00:33.264855 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:33 crc kubenswrapper[4777]: I0216 22:00:33.302506 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.302484845 podStartE2EDuration="2.302484845s" podCreationTimestamp="2026-02-16 22:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:00:33.282731141 +0000 UTC m=+1353.865232253" watchObservedRunningTime="2026-02-16 22:00:33.302484845 +0000 UTC m=+1353.884985947" Feb 16 22:00:34 crc kubenswrapper[4777]: I0216 22:00:34.199445 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d135dd81-22b2-4c1d-9d7b-bc07f225abbb" path="/var/lib/kubelet/pods/d135dd81-22b2-4c1d-9d7b-bc07f225abbb/volumes" Feb 16 22:00:35 crc kubenswrapper[4777]: E0216 22:00:35.622113 4777 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff397ae9c0033cfd5b74d38878b395a010ab9ba2761fe616dd2bebc03427ffad" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 16 22:00:35 crc kubenswrapper[4777]: E0216 22:00:35.624046 4777 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff397ae9c0033cfd5b74d38878b395a010ab9ba2761fe616dd2bebc03427ffad" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 16 22:00:35 crc kubenswrapper[4777]: E0216 22:00:35.628240 4777 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ff397ae9c0033cfd5b74d38878b395a010ab9ba2761fe616dd2bebc03427ffad" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 16 22:00:35 crc kubenswrapper[4777]: E0216 22:00:35.628276 4777 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b18de186-7ada-4893-900b-974aac51a495" containerName="nova-scheduler-scheduler" Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.300473 4777 generic.go:334] "Generic (PLEG): container finished" podID="b18de186-7ada-4893-900b-974aac51a495" containerID="ff397ae9c0033cfd5b74d38878b395a010ab9ba2761fe616dd2bebc03427ffad" exitCode=0 Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.300669 4777 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b18de186-7ada-4893-900b-974aac51a495","Type":"ContainerDied","Data":"ff397ae9c0033cfd5b74d38878b395a010ab9ba2761fe616dd2bebc03427ffad"} Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.426890 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.452352 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-combined-ca-bundle\") pod \"b18de186-7ada-4893-900b-974aac51a495\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.452620 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc55k\" (UniqueName: \"kubernetes.io/projected/b18de186-7ada-4893-900b-974aac51a495-kube-api-access-vc55k\") pod \"b18de186-7ada-4893-900b-974aac51a495\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.452733 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-config-data\") pod \"b18de186-7ada-4893-900b-974aac51a495\" (UID: \"b18de186-7ada-4893-900b-974aac51a495\") " Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.458637 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18de186-7ada-4893-900b-974aac51a495-kube-api-access-vc55k" (OuterVolumeSpecName: "kube-api-access-vc55k") pod "b18de186-7ada-4893-900b-974aac51a495" (UID: "b18de186-7ada-4893-900b-974aac51a495"). InnerVolumeSpecName "kube-api-access-vc55k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.491879 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-config-data" (OuterVolumeSpecName: "config-data") pod "b18de186-7ada-4893-900b-974aac51a495" (UID: "b18de186-7ada-4893-900b-974aac51a495"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.513048 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b18de186-7ada-4893-900b-974aac51a495" (UID: "b18de186-7ada-4893-900b-974aac51a495"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.556520 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.556565 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc55k\" (UniqueName: \"kubernetes.io/projected/b18de186-7ada-4893-900b-974aac51a495-kube-api-access-vc55k\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:36 crc kubenswrapper[4777]: I0216 22:00:36.556576 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18de186-7ada-4893-900b-974aac51a495-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.181615 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.270449 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f188aa13-f485-4572-a808-fac0f4e134a2-logs\") pod \"f188aa13-f485-4572-a808-fac0f4e134a2\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.270525 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-config-data\") pod \"f188aa13-f485-4572-a808-fac0f4e134a2\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.270824 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7547n\" (UniqueName: \"kubernetes.io/projected/f188aa13-f485-4572-a808-fac0f4e134a2-kube-api-access-7547n\") pod \"f188aa13-f485-4572-a808-fac0f4e134a2\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.270864 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-combined-ca-bundle\") pod \"f188aa13-f485-4572-a808-fac0f4e134a2\" (UID: \"f188aa13-f485-4572-a808-fac0f4e134a2\") " Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.271296 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f188aa13-f485-4572-a808-fac0f4e134a2-logs" (OuterVolumeSpecName: "logs") pod "f188aa13-f485-4572-a808-fac0f4e134a2" (UID: "f188aa13-f485-4572-a808-fac0f4e134a2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.271545 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f188aa13-f485-4572-a808-fac0f4e134a2-logs\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.291244 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f188aa13-f485-4572-a808-fac0f4e134a2-kube-api-access-7547n" (OuterVolumeSpecName: "kube-api-access-7547n") pod "f188aa13-f485-4572-a808-fac0f4e134a2" (UID: "f188aa13-f485-4572-a808-fac0f4e134a2"). InnerVolumeSpecName "kube-api-access-7547n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.341111 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f188aa13-f485-4572-a808-fac0f4e134a2" (UID: "f188aa13-f485-4572-a808-fac0f4e134a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.344975 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-config-data" (OuterVolumeSpecName: "config-data") pod "f188aa13-f485-4572-a808-fac0f4e134a2" (UID: "f188aa13-f485-4572-a808-fac0f4e134a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.373606 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.373636 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f188aa13-f485-4572-a808-fac0f4e134a2-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.373646 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7547n\" (UniqueName: \"kubernetes.io/projected/f188aa13-f485-4572-a808-fac0f4e134a2-kube-api-access-7547n\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.375136 4777 generic.go:334] "Generic (PLEG): container finished" podID="f188aa13-f485-4572-a808-fac0f4e134a2" containerID="72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d" exitCode=0 Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.375196 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f188aa13-f485-4572-a808-fac0f4e134a2","Type":"ContainerDied","Data":"72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d"} Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.375225 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f188aa13-f485-4572-a808-fac0f4e134a2","Type":"ContainerDied","Data":"1ebc8ab7d534768f57be6f4890692cce60fbf206cd887bb35d2953e4ceb1ce85"} Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.375241 4777 scope.go:117] "RemoveContainer" containerID="72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.375353 4777 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.395293 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b18de186-7ada-4893-900b-974aac51a495","Type":"ContainerDied","Data":"759832d7b93c75f12f7859ec3ec3bd780fd2dfb3c7e6d0d4fda1435f10942fc3"} Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.395367 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.437222 4777 scope.go:117] "RemoveContainer" containerID="e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.466849 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.480819 4777 scope.go:117] "RemoveContainer" containerID="72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d" Feb 16 22:00:37 crc kubenswrapper[4777]: E0216 22:00:37.483425 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d\": container with ID starting with 72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d not found: ID does not exist" containerID="72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.483507 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d"} err="failed to get container status \"72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d\": rpc error: code = NotFound desc = could not find container \"72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d\": container with ID 
starting with 72b152b1b7348c3cf78caed1353437a94587eedcdc2d9853a377e1c2125c3f3d not found: ID does not exist" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.483933 4777 scope.go:117] "RemoveContainer" containerID="e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b" Feb 16 22:00:37 crc kubenswrapper[4777]: E0216 22:00:37.493936 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b\": container with ID starting with e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b not found: ID does not exist" containerID="e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.494007 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b"} err="failed to get container status \"e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b\": rpc error: code = NotFound desc = could not find container \"e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b\": container with ID starting with e61acd36e291098eacc0bf1affa73650132114a5a8d08e1b380f0e68ed9b4b1b not found: ID does not exist" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.494072 4777 scope.go:117] "RemoveContainer" containerID="ff397ae9c0033cfd5b74d38878b395a010ab9ba2761fe616dd2bebc03427ffad" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.499321 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.539455 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 16 22:00:37 crc kubenswrapper[4777]: E0216 22:00:37.540004 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18de186-7ada-4893-900b-974aac51a495" 
containerName="nova-scheduler-scheduler" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.540028 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18de186-7ada-4893-900b-974aac51a495" containerName="nova-scheduler-scheduler" Feb 16 22:00:37 crc kubenswrapper[4777]: E0216 22:00:37.540051 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d135dd81-22b2-4c1d-9d7b-bc07f225abbb" containerName="dnsmasq-dns" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.540059 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="d135dd81-22b2-4c1d-9d7b-bc07f225abbb" containerName="dnsmasq-dns" Feb 16 22:00:37 crc kubenswrapper[4777]: E0216 22:00:37.540072 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d135dd81-22b2-4c1d-9d7b-bc07f225abbb" containerName="init" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.540078 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="d135dd81-22b2-4c1d-9d7b-bc07f225abbb" containerName="init" Feb 16 22:00:37 crc kubenswrapper[4777]: E0216 22:00:37.540099 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f188aa13-f485-4572-a808-fac0f4e134a2" containerName="nova-api-api" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.540105 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f188aa13-f485-4572-a808-fac0f4e134a2" containerName="nova-api-api" Feb 16 22:00:37 crc kubenswrapper[4777]: E0216 22:00:37.540118 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f188aa13-f485-4572-a808-fac0f4e134a2" containerName="nova-api-log" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.540126 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f188aa13-f485-4572-a808-fac0f4e134a2" containerName="nova-api-log" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.540316 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f188aa13-f485-4572-a808-fac0f4e134a2" containerName="nova-api-api" Feb 16 
22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.540330 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="d135dd81-22b2-4c1d-9d7b-bc07f225abbb" containerName="dnsmasq-dns" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.540342 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f188aa13-f485-4572-a808-fac0f4e134a2" containerName="nova-api-log" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.540353 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="b18de186-7ada-4893-900b-974aac51a495" containerName="nova-scheduler-scheduler" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.543556 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.546969 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.551126 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.567799 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.581475 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.592633 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.594300 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.595821 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-config-data\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.596308 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.596404 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhnq6\" (UniqueName: \"kubernetes.io/projected/05b6aced-df73-44a5-9338-553f28ad707b-kube-api-access-bhnq6\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.596509 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05b6aced-df73-44a5-9338-553f28ad707b-logs\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.602892 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.604976 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.698931 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-config-data\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.698983 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.699023 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-config-data\") pod \"nova-scheduler-0\" (UID: \"1602cea5-c845-470a-812e-6c1f22e201be\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.699068 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhnq6\" (UniqueName: \"kubernetes.io/projected/05b6aced-df73-44a5-9338-553f28ad707b-kube-api-access-bhnq6\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.699085 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05b6aced-df73-44a5-9338-553f28ad707b-logs\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.699209 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvzjh\" (UniqueName: \"kubernetes.io/projected/1602cea5-c845-470a-812e-6c1f22e201be-kube-api-access-qvzjh\") pod \"nova-scheduler-0\" (UID: 
\"1602cea5-c845-470a-812e-6c1f22e201be\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.699400 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1602cea5-c845-470a-812e-6c1f22e201be\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.699569 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05b6aced-df73-44a5-9338-553f28ad707b-logs\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.702977 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.703576 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-config-data\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.721562 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhnq6\" (UniqueName: \"kubernetes.io/projected/05b6aced-df73-44a5-9338-553f28ad707b-kube-api-access-bhnq6\") pod \"nova-api-0\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.800780 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qvzjh\" (UniqueName: \"kubernetes.io/projected/1602cea5-c845-470a-812e-6c1f22e201be-kube-api-access-qvzjh\") pod \"nova-scheduler-0\" (UID: \"1602cea5-c845-470a-812e-6c1f22e201be\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.800875 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1602cea5-c845-470a-812e-6c1f22e201be\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.800972 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-config-data\") pod \"nova-scheduler-0\" (UID: \"1602cea5-c845-470a-812e-6c1f22e201be\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.804360 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-config-data\") pod \"nova-scheduler-0\" (UID: \"1602cea5-c845-470a-812e-6c1f22e201be\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.804557 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1602cea5-c845-470a-812e-6c1f22e201be\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.816735 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvzjh\" (UniqueName: \"kubernetes.io/projected/1602cea5-c845-470a-812e-6c1f22e201be-kube-api-access-qvzjh\") pod \"nova-scheduler-0\" (UID: 
\"1602cea5-c845-470a-812e-6c1f22e201be\") " pod="openstack/nova-scheduler-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.882520 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 16 22:00:37 crc kubenswrapper[4777]: I0216 22:00:37.920526 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 16 22:00:38 crc kubenswrapper[4777]: I0216 22:00:38.194919 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18de186-7ada-4893-900b-974aac51a495" path="/var/lib/kubelet/pods/b18de186-7ada-4893-900b-974aac51a495/volumes" Feb 16 22:00:38 crc kubenswrapper[4777]: I0216 22:00:38.195858 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f188aa13-f485-4572-a808-fac0f4e134a2" path="/var/lib/kubelet/pods/f188aa13-f485-4572-a808-fac0f4e134a2/volumes" Feb 16 22:00:38 crc kubenswrapper[4777]: I0216 22:00:38.380376 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:00:38 crc kubenswrapper[4777]: I0216 22:00:38.405985 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05b6aced-df73-44a5-9338-553f28ad707b","Type":"ContainerStarted","Data":"1a2fe4be23b3882f8ecbcb52329409378a3da9909c898af3abc2fc08dae53941"} Feb 16 22:00:38 crc kubenswrapper[4777]: I0216 22:00:38.494243 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:00:38 crc kubenswrapper[4777]: W0216 22:00:38.516828 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1602cea5_c845_470a_812e_6c1f22e201be.slice/crio-413bca43c2e561148a417b4e1df6cc56f6cafcb517bb35c5eb3195b0614e9d50 WatchSource:0}: Error finding container 413bca43c2e561148a417b4e1df6cc56f6cafcb517bb35c5eb3195b0614e9d50: Status 404 returned error can't find the container with id 
413bca43c2e561148a417b4e1df6cc56f6cafcb517bb35c5eb3195b0614e9d50 Feb 16 22:00:39 crc kubenswrapper[4777]: I0216 22:00:39.421276 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05b6aced-df73-44a5-9338-553f28ad707b","Type":"ContainerStarted","Data":"9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4"} Feb 16 22:00:39 crc kubenswrapper[4777]: I0216 22:00:39.421649 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05b6aced-df73-44a5-9338-553f28ad707b","Type":"ContainerStarted","Data":"b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d"} Feb 16 22:00:39 crc kubenswrapper[4777]: I0216 22:00:39.425220 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1602cea5-c845-470a-812e-6c1f22e201be","Type":"ContainerStarted","Data":"47282fb44dcc1039dadf7c3d08f3d6d5c5632e82239bf7ab7a57ea2267ab6439"} Feb 16 22:00:39 crc kubenswrapper[4777]: I0216 22:00:39.425276 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1602cea5-c845-470a-812e-6c1f22e201be","Type":"ContainerStarted","Data":"413bca43c2e561148a417b4e1df6cc56f6cafcb517bb35c5eb3195b0614e9d50"} Feb 16 22:00:39 crc kubenswrapper[4777]: I0216 22:00:39.448506 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.448487406 podStartE2EDuration="2.448487406s" podCreationTimestamp="2026-02-16 22:00:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:00:39.444292489 +0000 UTC m=+1360.026793631" watchObservedRunningTime="2026-02-16 22:00:39.448487406 +0000 UTC m=+1360.030988528" Feb 16 22:00:39 crc kubenswrapper[4777]: I0216 22:00:39.476469 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" 
podStartSLOduration=2.476445869 podStartE2EDuration="2.476445869s" podCreationTimestamp="2026-02-16 22:00:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:00:39.462051856 +0000 UTC m=+1360.044552988" watchObservedRunningTime="2026-02-16 22:00:39.476445869 +0000 UTC m=+1360.058946991" Feb 16 22:00:41 crc kubenswrapper[4777]: I0216 22:00:41.651583 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 22:00:41 crc kubenswrapper[4777]: I0216 22:00:41.651951 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 22:00:41 crc kubenswrapper[4777]: I0216 22:00:41.652000 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 22:00:41 crc kubenswrapper[4777]: I0216 22:00:41.652700 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"79b7f65a1dac7c39999487fce7121e7b9f97dac2a41e3e237c9dfb015a4112fd"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 16 22:00:41 crc kubenswrapper[4777]: I0216 22:00:41.652775 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" 
podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://79b7f65a1dac7c39999487fce7121e7b9f97dac2a41e3e237c9dfb015a4112fd" gracePeriod=600 Feb 16 22:00:41 crc kubenswrapper[4777]: I0216 22:00:41.657020 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 16 22:00:42 crc kubenswrapper[4777]: I0216 22:00:42.471423 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="79b7f65a1dac7c39999487fce7121e7b9f97dac2a41e3e237c9dfb015a4112fd" exitCode=0 Feb 16 22:00:42 crc kubenswrapper[4777]: I0216 22:00:42.471470 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"79b7f65a1dac7c39999487fce7121e7b9f97dac2a41e3e237c9dfb015a4112fd"} Feb 16 22:00:42 crc kubenswrapper[4777]: I0216 22:00:42.471762 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2"} Feb 16 22:00:42 crc kubenswrapper[4777]: I0216 22:00:42.471785 4777 scope.go:117] "RemoveContainer" containerID="09c5b65387f90ae361507692bc923fcadef41388ff0532f5cec9f60ce0409e3f" Feb 16 22:00:42 crc kubenswrapper[4777]: I0216 22:00:42.613805 4777 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 16 22:00:42 crc kubenswrapper[4777]: I0216 22:00:42.920900 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 16 22:00:46 crc kubenswrapper[4777]: E0216 22:00:46.187553 4777 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.425878 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.525890 4777 generic.go:334] "Generic (PLEG): container finished" podID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerID="7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465" exitCode=137 Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.525929 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"267fa1e2-1ed3-4c2e-9973-1372be71330a","Type":"ContainerDied","Data":"7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465"} Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.525954 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"267fa1e2-1ed3-4c2e-9973-1372be71330a","Type":"ContainerDied","Data":"446922b5ed74b87828debe8344b2f3bde15c74baca6729bc91974d2741c96fac"} Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.525970 4777 scope.go:117] "RemoveContainer" containerID="7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.526094 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.543554 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-combined-ca-bundle\") pod \"267fa1e2-1ed3-4c2e-9973-1372be71330a\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.543623 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-config-data\") pod \"267fa1e2-1ed3-4c2e-9973-1372be71330a\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.543657 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-run-httpd\") pod \"267fa1e2-1ed3-4c2e-9973-1372be71330a\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.543682 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-scripts\") pod \"267fa1e2-1ed3-4c2e-9973-1372be71330a\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.543725 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-sg-core-conf-yaml\") pod \"267fa1e2-1ed3-4c2e-9973-1372be71330a\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.543884 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-ceilometer-tls-certs\") pod \"267fa1e2-1ed3-4c2e-9973-1372be71330a\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.543927 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-log-httpd\") pod \"267fa1e2-1ed3-4c2e-9973-1372be71330a\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.544071 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jwnt\" (UniqueName: \"kubernetes.io/projected/267fa1e2-1ed3-4c2e-9973-1372be71330a-kube-api-access-7jwnt\") pod \"267fa1e2-1ed3-4c2e-9973-1372be71330a\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.544338 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "267fa1e2-1ed3-4c2e-9973-1372be71330a" (UID: "267fa1e2-1ed3-4c2e-9973-1372be71330a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.544601 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "267fa1e2-1ed3-4c2e-9973-1372be71330a" (UID: "267fa1e2-1ed3-4c2e-9973-1372be71330a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.544724 4777 scope.go:117] "RemoveContainer" containerID="e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.545193 4777 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.545218 4777 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/267fa1e2-1ed3-4c2e-9973-1372be71330a-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.564398 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-scripts" (OuterVolumeSpecName: "scripts") pod "267fa1e2-1ed3-4c2e-9973-1372be71330a" (UID: "267fa1e2-1ed3-4c2e-9973-1372be71330a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.566383 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267fa1e2-1ed3-4c2e-9973-1372be71330a-kube-api-access-7jwnt" (OuterVolumeSpecName: "kube-api-access-7jwnt") pod "267fa1e2-1ed3-4c2e-9973-1372be71330a" (UID: "267fa1e2-1ed3-4c2e-9973-1372be71330a"). InnerVolumeSpecName "kube-api-access-7jwnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.575460 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "267fa1e2-1ed3-4c2e-9973-1372be71330a" (UID: "267fa1e2-1ed3-4c2e-9973-1372be71330a"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.596569 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "267fa1e2-1ed3-4c2e-9973-1372be71330a" (UID: "267fa1e2-1ed3-4c2e-9973-1372be71330a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.641925 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-config-data" (OuterVolumeSpecName: "config-data") pod "267fa1e2-1ed3-4c2e-9973-1372be71330a" (UID: "267fa1e2-1ed3-4c2e-9973-1372be71330a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.645494 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "267fa1e2-1ed3-4c2e-9973-1372be71330a" (UID: "267fa1e2-1ed3-4c2e-9973-1372be71330a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.646681 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-combined-ca-bundle\") pod \"267fa1e2-1ed3-4c2e-9973-1372be71330a\" (UID: \"267fa1e2-1ed3-4c2e-9973-1372be71330a\") " Feb 16 22:00:47 crc kubenswrapper[4777]: W0216 22:00:47.647312 4777 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/267fa1e2-1ed3-4c2e-9973-1372be71330a/volumes/kubernetes.io~secret/combined-ca-bundle Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.647331 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "267fa1e2-1ed3-4c2e-9973-1372be71330a" (UID: "267fa1e2-1ed3-4c2e-9973-1372be71330a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.649059 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.649089 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.649100 4777 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.649114 4777 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.649122 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jwnt\" (UniqueName: \"kubernetes.io/projected/267fa1e2-1ed3-4c2e-9973-1372be71330a-kube-api-access-7jwnt\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.671521 4777 scope.go:117] "RemoveContainer" containerID="7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.693766 4777 scope.go:117] "RemoveContainer" containerID="e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.722931 4777 scope.go:117] "RemoveContainer" containerID="7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465" Feb 16 22:00:47 crc kubenswrapper[4777]: E0216 22:00:47.732342 4777 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465\": container with ID starting with 7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465 not found: ID does not exist" containerID="7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.732400 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465"} err="failed to get container status \"7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465\": rpc error: code = NotFound desc = could not find container \"7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465\": container with ID starting with 7a3fcb7b946a0f7d8371c2d4cdc31554950afec33158bca1b1757735d45da465 not found: ID does not exist" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.732429 4777 scope.go:117] "RemoveContainer" containerID="e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44" Feb 16 22:00:47 crc kubenswrapper[4777]: E0216 22:00:47.732865 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44\": container with ID starting with e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44 not found: ID does not exist" containerID="e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.732903 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44"} err="failed to get container status \"e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44\": rpc error: code = NotFound 
desc = could not find container \"e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44\": container with ID starting with e902a27602e46b06c4c3fb8fa7d0e579a9945655f659c8f4f4c38813ed463d44 not found: ID does not exist" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.732918 4777 scope.go:117] "RemoveContainer" containerID="7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc" Feb 16 22:00:47 crc kubenswrapper[4777]: E0216 22:00:47.733486 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc\": container with ID starting with 7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc not found: ID does not exist" containerID="7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.733536 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc"} err="failed to get container status \"7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc\": rpc error: code = NotFound desc = could not find container \"7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc\": container with ID starting with 7d2b2da2b813c43f3d5fa1029e0e772938b727c884f5ac1db239bc94c18b75dc not found: ID does not exist" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.733570 4777 scope.go:117] "RemoveContainer" containerID="e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443" Feb 16 22:00:47 crc kubenswrapper[4777]: E0216 22:00:47.733926 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443\": container with ID starting with 
e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443 not found: ID does not exist" containerID="e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.733982 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443"} err="failed to get container status \"e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443\": rpc error: code = NotFound desc = could not find container \"e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443\": container with ID starting with e015a78f74aac5539e921cdc4a715e6ea7ebc4c6054b9df1b0b8f26a3fd8a443 not found: ID does not exist" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.751753 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267fa1e2-1ed3-4c2e-9973-1372be71330a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.875838 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.883950 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.884041 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.921342 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.930801 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.967806 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 16 
22:00:47 crc kubenswrapper[4777]: E0216 22:00:47.968228 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="sg-core" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.968245 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="sg-core" Feb 16 22:00:47 crc kubenswrapper[4777]: E0216 22:00:47.968278 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="ceilometer-central-agent" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.968285 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="ceilometer-central-agent" Feb 16 22:00:47 crc kubenswrapper[4777]: E0216 22:00:47.968292 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="proxy-httpd" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.968297 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="proxy-httpd" Feb 16 22:00:47 crc kubenswrapper[4777]: E0216 22:00:47.968305 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="ceilometer-notification-agent" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.968312 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="ceilometer-notification-agent" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.968481 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="proxy-httpd" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.968499 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" 
containerName="ceilometer-central-agent" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.968507 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="ceilometer-notification-agent" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.968523 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" containerName="sg-core" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.970268 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.970309 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.974227 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.974401 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.974467 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 16 22:00:47 crc kubenswrapper[4777]: I0216 22:00:47.977475 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.159188 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-run-httpd\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.159251 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.159486 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.159544 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-scripts\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.159601 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clrlb\" (UniqueName: \"kubernetes.io/projected/9972ceba-f718-4579-9eac-0875e4b50b09-kube-api-access-clrlb\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.159734 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-config-data\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.159851 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.159915 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-log-httpd\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.191657 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267fa1e2-1ed3-4c2e-9973-1372be71330a" path="/var/lib/kubelet/pods/267fa1e2-1ed3-4c2e-9973-1372be71330a/volumes" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.262537 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.262642 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-log-httpd\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.262747 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-run-httpd\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.262818 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.262879 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.262920 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-scripts\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.262945 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clrlb\" (UniqueName: \"kubernetes.io/projected/9972ceba-f718-4579-9eac-0875e4b50b09-kube-api-access-clrlb\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.263006 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-config-data\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.263219 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-log-httpd\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 
22:00:48.263473 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-run-httpd\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.268459 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.268649 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-scripts\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.268763 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.281054 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-config-data\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.281450 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " 
pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.293862 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clrlb\" (UniqueName: \"kubernetes.io/projected/9972ceba-f718-4579-9eac-0875e4b50b09-kube-api-access-clrlb\") pod \"ceilometer-0\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.573585 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.588530 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.966972 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="05b6aced-df73-44a5-9338-553f28ad707b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 16 22:00:48 crc kubenswrapper[4777]: I0216 22:00:48.966998 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="05b6aced-df73-44a5-9338-553f28ad707b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 16 22:00:49 crc kubenswrapper[4777]: W0216 22:00:49.203296 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9972ceba_f718_4579_9eac_0875e4b50b09.slice/crio-4f9d6ae29fdd9ceebe5ba93ad6e4eee82ceec0523dd728974d3a607a95c9a060 WatchSource:0}: Error finding container 4f9d6ae29fdd9ceebe5ba93ad6e4eee82ceec0523dd728974d3a607a95c9a060: Status 404 returned error can't find the container with id 4f9d6ae29fdd9ceebe5ba93ad6e4eee82ceec0523dd728974d3a607a95c9a060 Feb 16 
22:00:49 crc kubenswrapper[4777]: I0216 22:00:49.205501 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:00:49 crc kubenswrapper[4777]: I0216 22:00:49.549920 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9972ceba-f718-4579-9eac-0875e4b50b09","Type":"ContainerStarted","Data":"4f9d6ae29fdd9ceebe5ba93ad6e4eee82ceec0523dd728974d3a607a95c9a060"} Feb 16 22:00:50 crc kubenswrapper[4777]: I0216 22:00:50.562485 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9972ceba-f718-4579-9eac-0875e4b50b09","Type":"ContainerStarted","Data":"97bfa750fd87b00358112c7008ccf5dea33a34e8fd107edeffbb7d248458a4b8"} Feb 16 22:00:51 crc kubenswrapper[4777]: I0216 22:00:51.576640 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9972ceba-f718-4579-9eac-0875e4b50b09","Type":"ContainerStarted","Data":"1c51640e3f8f958de676dd5a3410ace1d2d68442dc4b5279f7a4c1b59691a63a"} Feb 16 22:00:51 crc kubenswrapper[4777]: I0216 22:00:51.577324 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9972ceba-f718-4579-9eac-0875e4b50b09","Type":"ContainerStarted","Data":"8d6e93faaa514c5344efede0ae010a928105766105501eda75397198200a312c"} Feb 16 22:00:53 crc kubenswrapper[4777]: I0216 22:00:53.610239 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9972ceba-f718-4579-9eac-0875e4b50b09","Type":"ContainerStarted","Data":"9614c62ac2cfa4ded35e869e9662e2cacf130dc0379c7989dfaf39b7778a7982"} Feb 16 22:00:53 crc kubenswrapper[4777]: I0216 22:00:53.610938 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 16 22:00:53 crc kubenswrapper[4777]: I0216 22:00:53.646334 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.214162971 
podStartE2EDuration="6.646314836s" podCreationTimestamp="2026-02-16 22:00:47 +0000 UTC" firstStartedPulling="2026-02-16 22:00:49.206013125 +0000 UTC m=+1369.788514227" lastFinishedPulling="2026-02-16 22:00:52.63816497 +0000 UTC m=+1373.220666092" observedRunningTime="2026-02-16 22:00:53.64148509 +0000 UTC m=+1374.223986242" watchObservedRunningTime="2026-02-16 22:00:53.646314836 +0000 UTC m=+1374.228815948" Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.645642 4777 generic.go:334] "Generic (PLEG): container finished" podID="b43568b7-224c-4ace-b602-c0c7008ab829" containerID="7f86f61bf328e1461a3a3cb5ab33db2178c5b3b20efc8fd3bf80d8b356ba838a" exitCode=137 Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.645759 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b43568b7-224c-4ace-b602-c0c7008ab829","Type":"ContainerDied","Data":"7f86f61bf328e1461a3a3cb5ab33db2178c5b3b20efc8fd3bf80d8b356ba838a"} Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.646141 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b43568b7-224c-4ace-b602-c0c7008ab829","Type":"ContainerDied","Data":"6f035504184ad6a4a51838ca46d70183cd18ec25fe94b4924034378211835c16"} Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.646163 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f035504184ad6a4a51838ca46d70183cd18ec25fe94b4924034378211835c16" Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.656135 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.660365 4777 generic.go:334] "Generic (PLEG): container finished" podID="c80239e6-1745-49a4-87ba-f7f835bab0fc" containerID="5bf312845e6a092d578609bd2019e50ffaff95942213bc94877e84e270983af5" exitCode=137
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.660398 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c80239e6-1745-49a4-87ba-f7f835bab0fc","Type":"ContainerDied","Data":"5bf312845e6a092d578609bd2019e50ffaff95942213bc94877e84e270983af5"}
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.660419 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c80239e6-1745-49a4-87ba-f7f835bab0fc","Type":"ContainerDied","Data":"0e8f5905c9fe0cca8e3b62f0354b0869ee43e6eb18eb54de3e1b16afc9496c9e"}
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.660446 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e8f5905c9fe0cca8e3b62f0354b0869ee43e6eb18eb54de3e1b16afc9496c9e"
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.662812 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.749103 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-combined-ca-bundle\") pod \"c80239e6-1745-49a4-87ba-f7f835bab0fc\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") "
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.749177 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncw55\" (UniqueName: \"kubernetes.io/projected/b43568b7-224c-4ace-b602-c0c7008ab829-kube-api-access-ncw55\") pod \"b43568b7-224c-4ace-b602-c0c7008ab829\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") "
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.749211 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-combined-ca-bundle\") pod \"b43568b7-224c-4ace-b602-c0c7008ab829\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") "
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.749290 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-config-data\") pod \"c80239e6-1745-49a4-87ba-f7f835bab0fc\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") "
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.749356 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdwrk\" (UniqueName: \"kubernetes.io/projected/c80239e6-1745-49a4-87ba-f7f835bab0fc-kube-api-access-zdwrk\") pod \"c80239e6-1745-49a4-87ba-f7f835bab0fc\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") "
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.749483 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-config-data\") pod \"b43568b7-224c-4ace-b602-c0c7008ab829\" (UID: \"b43568b7-224c-4ace-b602-c0c7008ab829\") "
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.749509 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c80239e6-1745-49a4-87ba-f7f835bab0fc-logs\") pod \"c80239e6-1745-49a4-87ba-f7f835bab0fc\" (UID: \"c80239e6-1745-49a4-87ba-f7f835bab0fc\") "
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.750903 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c80239e6-1745-49a4-87ba-f7f835bab0fc-logs" (OuterVolumeSpecName: "logs") pod "c80239e6-1745-49a4-87ba-f7f835bab0fc" (UID: "c80239e6-1745-49a4-87ba-f7f835bab0fc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.755765 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b43568b7-224c-4ace-b602-c0c7008ab829-kube-api-access-ncw55" (OuterVolumeSpecName: "kube-api-access-ncw55") pod "b43568b7-224c-4ace-b602-c0c7008ab829" (UID: "b43568b7-224c-4ace-b602-c0c7008ab829"). InnerVolumeSpecName "kube-api-access-ncw55". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.758506 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c80239e6-1745-49a4-87ba-f7f835bab0fc-kube-api-access-zdwrk" (OuterVolumeSpecName: "kube-api-access-zdwrk") pod "c80239e6-1745-49a4-87ba-f7f835bab0fc" (UID: "c80239e6-1745-49a4-87ba-f7f835bab0fc"). InnerVolumeSpecName "kube-api-access-zdwrk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.782800 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-config-data" (OuterVolumeSpecName: "config-data") pod "b43568b7-224c-4ace-b602-c0c7008ab829" (UID: "b43568b7-224c-4ace-b602-c0c7008ab829"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.785922 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b43568b7-224c-4ace-b602-c0c7008ab829" (UID: "b43568b7-224c-4ace-b602-c0c7008ab829"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.792689 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-config-data" (OuterVolumeSpecName: "config-data") pod "c80239e6-1745-49a4-87ba-f7f835bab0fc" (UID: "c80239e6-1745-49a4-87ba-f7f835bab0fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.811234 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c80239e6-1745-49a4-87ba-f7f835bab0fc" (UID: "c80239e6-1745-49a4-87ba-f7f835bab0fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.851903 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncw55\" (UniqueName: \"kubernetes.io/projected/b43568b7-224c-4ace-b602-c0c7008ab829-kube-api-access-ncw55\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.851938 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.851948 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-config-data\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.851957 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdwrk\" (UniqueName: \"kubernetes.io/projected/c80239e6-1745-49a4-87ba-f7f835bab0fc-kube-api-access-zdwrk\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.851966 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b43568b7-224c-4ace-b602-c0c7008ab829-config-data\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.852004 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c80239e6-1745-49a4-87ba-f7f835bab0fc-logs\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:55 crc kubenswrapper[4777]: I0216 22:00:55.852014 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c80239e6-1745-49a4-87ba-f7f835bab0fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.698274 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.698319 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.724796 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.736583 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.745524 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.753112 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.760501 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 16 22:00:56 crc kubenswrapper[4777]: E0216 22:00:56.760898 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c80239e6-1745-49a4-87ba-f7f835bab0fc" containerName="nova-metadata-metadata"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.760914 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80239e6-1745-49a4-87ba-f7f835bab0fc" containerName="nova-metadata-metadata"
Feb 16 22:00:56 crc kubenswrapper[4777]: E0216 22:00:56.760923 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43568b7-224c-4ace-b602-c0c7008ab829" containerName="nova-cell1-novncproxy-novncproxy"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.760930 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43568b7-224c-4ace-b602-c0c7008ab829" containerName="nova-cell1-novncproxy-novncproxy"
Feb 16 22:00:56 crc kubenswrapper[4777]: E0216 22:00:56.760939 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c80239e6-1745-49a4-87ba-f7f835bab0fc" containerName="nova-metadata-log"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.760946 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80239e6-1745-49a4-87ba-f7f835bab0fc" containerName="nova-metadata-log"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.761118 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="c80239e6-1745-49a4-87ba-f7f835bab0fc" containerName="nova-metadata-log"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.761137 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="c80239e6-1745-49a4-87ba-f7f835bab0fc" containerName="nova-metadata-metadata"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.761153 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43568b7-224c-4ace-b602-c0c7008ab829" containerName="nova-cell1-novncproxy-novncproxy"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.761823 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.764402 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.765018 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.765379 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.789180 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.792583 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.806054 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.806330 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.808242 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.819788 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.873013 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.873082 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-logs\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.873127 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.873173 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.873318 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.873399 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whtnc\" (UniqueName: \"kubernetes.io/projected/ab854724-b71f-40d0-ac17-43ddc095c3ac-kube-api-access-whtnc\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.873460 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phvw7\" (UniqueName: \"kubernetes.io/projected/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-kube-api-access-phvw7\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.873501 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.873541 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.873582 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-config-data\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.975633 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-config-data\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.975707 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.975750 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-logs\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.975781 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.975811 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.975844 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.975894 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whtnc\" (UniqueName: \"kubernetes.io/projected/ab854724-b71f-40d0-ac17-43ddc095c3ac-kube-api-access-whtnc\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.975929 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phvw7\" (UniqueName: \"kubernetes.io/projected/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-kube-api-access-phvw7\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.975950 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.975978 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.976896 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-logs\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.983778 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.984551 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.985117 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.987193 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.994772 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.995460 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab854724-b71f-40d0-ac17-43ddc095c3ac-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.996823 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-config-data\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.997382 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phvw7\" (UniqueName: \"kubernetes.io/projected/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-kube-api-access-phvw7\") pod \"nova-metadata-0\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " pod="openstack/nova-metadata-0"
Feb 16 22:00:56 crc kubenswrapper[4777]: I0216 22:00:56.998586 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whtnc\" (UniqueName: \"kubernetes.io/projected/ab854724-b71f-40d0-ac17-43ddc095c3ac-kube-api-access-whtnc\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab854724-b71f-40d0-ac17-43ddc095c3ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:57 crc kubenswrapper[4777]: I0216 22:00:57.086574 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:00:57 crc kubenswrapper[4777]: I0216 22:00:57.110156 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 16 22:00:57 crc kubenswrapper[4777]: I0216 22:00:57.589634 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 16 22:00:57 crc kubenswrapper[4777]: I0216 22:00:57.600291 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 16 22:00:57 crc kubenswrapper[4777]: W0216 22:00:57.600797 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab854724_b71f_40d0_ac17_43ddc095c3ac.slice/crio-f2c9000a7de359f98fcf16faaacf4cf2cf9e0fa92f063980a42d4b6a627e80f8 WatchSource:0}: Error finding container f2c9000a7de359f98fcf16faaacf4cf2cf9e0fa92f063980a42d4b6a627e80f8: Status 404 returned error can't find the container with id f2c9000a7de359f98fcf16faaacf4cf2cf9e0fa92f063980a42d4b6a627e80f8
Feb 16 22:00:57 crc kubenswrapper[4777]: I0216 22:00:57.708869 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d","Type":"ContainerStarted","Data":"8b25a2352e191b32986c3893391ddd94f40620c6f8d7913d1aafbefe2b6f9ab8"}
Feb 16 22:00:57 crc kubenswrapper[4777]: I0216 22:00:57.710457 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ab854724-b71f-40d0-ac17-43ddc095c3ac","Type":"ContainerStarted","Data":"f2c9000a7de359f98fcf16faaacf4cf2cf9e0fa92f063980a42d4b6a627e80f8"}
Feb 16 22:00:57 crc kubenswrapper[4777]: I0216 22:00:57.888081 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 16 22:00:57 crc kubenswrapper[4777]: I0216 22:00:57.888649 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 16 22:00:57 crc kubenswrapper[4777]: I0216 22:00:57.890084 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 16 22:00:57 crc kubenswrapper[4777]: I0216 22:00:57.893735 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 16 22:00:58 crc kubenswrapper[4777]: E0216 22:00:58.183393 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:00:58 crc kubenswrapper[4777]: I0216 22:00:58.192053 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b43568b7-224c-4ace-b602-c0c7008ab829" path="/var/lib/kubelet/pods/b43568b7-224c-4ace-b602-c0c7008ab829/volumes"
Feb 16 22:00:58 crc kubenswrapper[4777]: I0216 22:00:58.192598 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c80239e6-1745-49a4-87ba-f7f835bab0fc" path="/var/lib/kubelet/pods/c80239e6-1745-49a4-87ba-f7f835bab0fc/volumes"
Feb 16 22:00:58 crc kubenswrapper[4777]: I0216 22:00:58.723467 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d","Type":"ContainerStarted","Data":"0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832"}
Feb 16 22:00:58 crc kubenswrapper[4777]: I0216 22:00:58.723516 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d","Type":"ContainerStarted","Data":"17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d"}
Feb 16 22:00:58 crc kubenswrapper[4777]: I0216 22:00:58.727900 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ab854724-b71f-40d0-ac17-43ddc095c3ac","Type":"ContainerStarted","Data":"6b23e4d0b050d85d00b7a83df238a72aa454943b5747ea649c3249ede8bf4f20"}
Feb 16 22:00:58 crc kubenswrapper[4777]: I0216 22:00:58.728379 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 16 22:00:58 crc kubenswrapper[4777]: I0216 22:00:58.731774 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 16 22:00:58 crc kubenswrapper[4777]: I0216 22:00:58.751470 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.751446466 podStartE2EDuration="2.751446466s" podCreationTimestamp="2026-02-16 22:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:00:58.74302373 +0000 UTC m=+1379.325524872" watchObservedRunningTime="2026-02-16 22:00:58.751446466 +0000 UTC m=+1379.333947608"
Feb 16 22:00:58 crc kubenswrapper[4777]: I0216 22:00:58.783980 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.783949956 podStartE2EDuration="2.783949956s" podCreationTimestamp="2026-02-16 22:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:00:58.759300216 +0000 UTC m=+1379.341801318" watchObservedRunningTime="2026-02-16 22:00:58.783949956 +0000 UTC m=+1379.366451088"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:58.994137 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qsg2h"]
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.004648 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.011826 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qsg2h"]
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.128566 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.128674 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-config\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.128792 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.128896 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njs7g\" (UniqueName: \"kubernetes.io/projected/368d5445-0b3c-4ea7-b7d8-674eaca061df-kube-api-access-njs7g\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.128927 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.128986 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.230409 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njs7g\" (UniqueName: \"kubernetes.io/projected/368d5445-0b3c-4ea7-b7d8-674eaca061df-kube-api-access-njs7g\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.230455 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.230490 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.230548 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.230579 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-config\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.230628 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.231790 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.231796 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.231943 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-config\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.232080 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.232340 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368d5445-0b3c-4ea7-b7d8-674eaca061df-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.250773 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njs7g\" (UniqueName: \"kubernetes.io/projected/368d5445-0b3c-4ea7-b7d8-674eaca061df-kube-api-access-njs7g\") pod \"dnsmasq-dns-89c5cd4d5-qsg2h\" (UID: \"368d5445-0b3c-4ea7-b7d8-674eaca061df\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.329328 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:00:59 crc kubenswrapper[4777]: I0216 22:00:59.863298 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qsg2h"]
Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.149245 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29521321-ljtlr"]
Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.150921 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29521321-ljtlr"
Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.152766 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29521321-ljtlr"]
Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.281859 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-combined-ca-bundle\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr"
Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.281968 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-fernet-keys\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr"
Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.282007 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-config-data\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr"
Feb 16 22:01:00 crc kubenswrapper[4777]: I0216
22:01:00.282237 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nslz\" (UniqueName: \"kubernetes.io/projected/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-kube-api-access-8nslz\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.384596 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-combined-ca-bundle\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.384654 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-fernet-keys\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.384682 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-config-data\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.384770 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nslz\" (UniqueName: \"kubernetes.io/projected/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-kube-api-access-8nslz\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.389465 4777 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-fernet-keys\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.389864 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-config-data\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.391504 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-combined-ca-bundle\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.399256 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nslz\" (UniqueName: \"kubernetes.io/projected/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-kube-api-access-8nslz\") pod \"keystone-cron-29521321-ljtlr\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.548385 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.746611 4777 generic.go:334] "Generic (PLEG): container finished" podID="368d5445-0b3c-4ea7-b7d8-674eaca061df" containerID="f4474c4ec69536f2d8179803aa14477f76b6a907fb1d7e4c17e3ca1ce602d21b" exitCode=0 Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.746695 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h" event={"ID":"368d5445-0b3c-4ea7-b7d8-674eaca061df","Type":"ContainerDied","Data":"f4474c4ec69536f2d8179803aa14477f76b6a907fb1d7e4c17e3ca1ce602d21b"} Feb 16 22:01:00 crc kubenswrapper[4777]: I0216 22:01:00.746733 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h" event={"ID":"368d5445-0b3c-4ea7-b7d8-674eaca061df","Type":"ContainerStarted","Data":"d0168bd87e4f0e293fc7f23330a36f31cb158751e9b7c4f26021d28917b3a412"} Feb 16 22:01:01 crc kubenswrapper[4777]: W0216 22:01:01.045571 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf98b7fd0_6b82_4bba_ac85_6ee9afa204aa.slice/crio-f2a740687a6bf837121dd636aa96df0bcf5d2ebd5484f7727524e49780b0a30b WatchSource:0}: Error finding container f2a740687a6bf837121dd636aa96df0bcf5d2ebd5484f7727524e49780b0a30b: Status 404 returned error can't find the container with id f2a740687a6bf837121dd636aa96df0bcf5d2ebd5484f7727524e49780b0a30b Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.051014 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29521321-ljtlr"] Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.481258 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.481819 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="ceilometer-central-agent" containerID="cri-o://97bfa750fd87b00358112c7008ccf5dea33a34e8fd107edeffbb7d248458a4b8" gracePeriod=30 Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.481949 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="sg-core" containerID="cri-o://1c51640e3f8f958de676dd5a3410ace1d2d68442dc4b5279f7a4c1b59691a63a" gracePeriod=30 Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.481946 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="proxy-httpd" containerID="cri-o://9614c62ac2cfa4ded35e869e9662e2cacf130dc0379c7989dfaf39b7778a7982" gracePeriod=30 Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.482200 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="ceilometer-notification-agent" containerID="cri-o://8d6e93faaa514c5344efede0ae010a928105766105501eda75397198200a312c" gracePeriod=30 Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.655377 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.756005 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29521321-ljtlr" event={"ID":"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa","Type":"ContainerStarted","Data":"6a790387d26407b07d3ccbf0feba30940a2e8ec0f1992290d12c41698293942f"} Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.756053 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29521321-ljtlr" 
event={"ID":"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa","Type":"ContainerStarted","Data":"f2a740687a6bf837121dd636aa96df0bcf5d2ebd5484f7727524e49780b0a30b"} Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.758595 4777 generic.go:334] "Generic (PLEG): container finished" podID="9972ceba-f718-4579-9eac-0875e4b50b09" containerID="9614c62ac2cfa4ded35e869e9662e2cacf130dc0379c7989dfaf39b7778a7982" exitCode=0 Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.758616 4777 generic.go:334] "Generic (PLEG): container finished" podID="9972ceba-f718-4579-9eac-0875e4b50b09" containerID="1c51640e3f8f958de676dd5a3410ace1d2d68442dc4b5279f7a4c1b59691a63a" exitCode=2 Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.758681 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9972ceba-f718-4579-9eac-0875e4b50b09","Type":"ContainerDied","Data":"9614c62ac2cfa4ded35e869e9662e2cacf130dc0379c7989dfaf39b7778a7982"} Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.758742 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9972ceba-f718-4579-9eac-0875e4b50b09","Type":"ContainerDied","Data":"1c51640e3f8f958de676dd5a3410ace1d2d68442dc4b5279f7a4c1b59691a63a"} Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.760937 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h" event={"ID":"368d5445-0b3c-4ea7-b7d8-674eaca061df","Type":"ContainerStarted","Data":"a7f13d7fd0a7195729a3de142b23369b5ddd6212ac6c41f39fe317e1924ceef9"} Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.761031 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="05b6aced-df73-44a5-9338-553f28ad707b" containerName="nova-api-log" containerID="cri-o://b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d" gracePeriod=30 Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.761084 4777 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="05b6aced-df73-44a5-9338-553f28ad707b" containerName="nova-api-api" containerID="cri-o://9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4" gracePeriod=30 Feb 16 22:01:01 crc kubenswrapper[4777]: I0216 22:01:01.776378 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29521321-ljtlr" podStartSLOduration=1.7763654450000002 podStartE2EDuration="1.776365445s" podCreationTimestamp="2026-02-16 22:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:01:01.77120778 +0000 UTC m=+1382.353708882" watchObservedRunningTime="2026-02-16 22:01:01.776365445 +0000 UTC m=+1382.358866547" Feb 16 22:01:02 crc kubenswrapper[4777]: I0216 22:01:02.087584 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 16 22:01:02 crc kubenswrapper[4777]: I0216 22:01:02.111133 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 16 22:01:02 crc kubenswrapper[4777]: I0216 22:01:02.111180 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 16 22:01:02 crc kubenswrapper[4777]: I0216 22:01:02.787512 4777 generic.go:334] "Generic (PLEG): container finished" podID="9972ceba-f718-4579-9eac-0875e4b50b09" containerID="8d6e93faaa514c5344efede0ae010a928105766105501eda75397198200a312c" exitCode=0 Feb 16 22:01:02 crc kubenswrapper[4777]: I0216 22:01:02.787748 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9972ceba-f718-4579-9eac-0875e4b50b09","Type":"ContainerDied","Data":"8d6e93faaa514c5344efede0ae010a928105766105501eda75397198200a312c"} Feb 16 22:01:02 crc kubenswrapper[4777]: I0216 22:01:02.787806 4777 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9972ceba-f718-4579-9eac-0875e4b50b09","Type":"ContainerDied","Data":"97bfa750fd87b00358112c7008ccf5dea33a34e8fd107edeffbb7d248458a4b8"} Feb 16 22:01:02 crc kubenswrapper[4777]: I0216 22:01:02.787773 4777 generic.go:334] "Generic (PLEG): container finished" podID="9972ceba-f718-4579-9eac-0875e4b50b09" containerID="97bfa750fd87b00358112c7008ccf5dea33a34e8fd107edeffbb7d248458a4b8" exitCode=0 Feb 16 22:01:02 crc kubenswrapper[4777]: I0216 22:01:02.792464 4777 generic.go:334] "Generic (PLEG): container finished" podID="05b6aced-df73-44a5-9338-553f28ad707b" containerID="b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d" exitCode=143 Feb 16 22:01:02 crc kubenswrapper[4777]: I0216 22:01:02.793117 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05b6aced-df73-44a5-9338-553f28ad707b","Type":"ContainerDied","Data":"b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d"} Feb 16 22:01:02 crc kubenswrapper[4777]: I0216 22:01:02.793940 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.063668 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.092940 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h" podStartSLOduration=5.092922817 podStartE2EDuration="5.092922817s" podCreationTimestamp="2026-02-16 22:00:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:01:01.800596064 +0000 UTC m=+1382.383097166" watchObservedRunningTime="2026-02-16 22:01:03.092922817 +0000 UTC m=+1383.675423919" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.243504 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-scripts\") pod \"9972ceba-f718-4579-9eac-0875e4b50b09\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.243578 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-ceilometer-tls-certs\") pod \"9972ceba-f718-4579-9eac-0875e4b50b09\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.243654 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-combined-ca-bundle\") pod \"9972ceba-f718-4579-9eac-0875e4b50b09\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.243734 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-sg-core-conf-yaml\") pod \"9972ceba-f718-4579-9eac-0875e4b50b09\" (UID: 
\"9972ceba-f718-4579-9eac-0875e4b50b09\") " Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.243785 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-log-httpd\") pod \"9972ceba-f718-4579-9eac-0875e4b50b09\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.243911 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clrlb\" (UniqueName: \"kubernetes.io/projected/9972ceba-f718-4579-9eac-0875e4b50b09-kube-api-access-clrlb\") pod \"9972ceba-f718-4579-9eac-0875e4b50b09\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.243974 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-run-httpd\") pod \"9972ceba-f718-4579-9eac-0875e4b50b09\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.244004 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-config-data\") pod \"9972ceba-f718-4579-9eac-0875e4b50b09\" (UID: \"9972ceba-f718-4579-9eac-0875e4b50b09\") " Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.244680 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9972ceba-f718-4579-9eac-0875e4b50b09" (UID: "9972ceba-f718-4579-9eac-0875e4b50b09"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.244910 4777 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.245070 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9972ceba-f718-4579-9eac-0875e4b50b09" (UID: "9972ceba-f718-4579-9eac-0875e4b50b09"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.248947 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-scripts" (OuterVolumeSpecName: "scripts") pod "9972ceba-f718-4579-9eac-0875e4b50b09" (UID: "9972ceba-f718-4579-9eac-0875e4b50b09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.249954 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9972ceba-f718-4579-9eac-0875e4b50b09-kube-api-access-clrlb" (OuterVolumeSpecName: "kube-api-access-clrlb") pod "9972ceba-f718-4579-9eac-0875e4b50b09" (UID: "9972ceba-f718-4579-9eac-0875e4b50b09"). InnerVolumeSpecName "kube-api-access-clrlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.273114 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9972ceba-f718-4579-9eac-0875e4b50b09" (UID: "9972ceba-f718-4579-9eac-0875e4b50b09"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.296093 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9972ceba-f718-4579-9eac-0875e4b50b09" (UID: "9972ceba-f718-4579-9eac-0875e4b50b09"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.330016 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9972ceba-f718-4579-9eac-0875e4b50b09" (UID: "9972ceba-f718-4579-9eac-0875e4b50b09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.347568 4777 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9972ceba-f718-4579-9eac-0875e4b50b09-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.347613 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clrlb\" (UniqueName: \"kubernetes.io/projected/9972ceba-f718-4579-9eac-0875e4b50b09-kube-api-access-clrlb\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.347629 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-scripts\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.347643 4777 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.347654 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.347666 4777 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.353729 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-config-data" (OuterVolumeSpecName: "config-data") pod "9972ceba-f718-4579-9eac-0875e4b50b09" (UID: "9972ceba-f718-4579-9eac-0875e4b50b09"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.449834 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9972ceba-f718-4579-9eac-0875e4b50b09-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.807181 4777 generic.go:334] "Generic (PLEG): container finished" podID="f98b7fd0-6b82-4bba-ac85-6ee9afa204aa" containerID="6a790387d26407b07d3ccbf0feba30940a2e8ec0f1992290d12c41698293942f" exitCode=0 Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.807546 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29521321-ljtlr" event={"ID":"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa","Type":"ContainerDied","Data":"6a790387d26407b07d3ccbf0feba30940a2e8ec0f1992290d12c41698293942f"} Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.830058 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.831183 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9972ceba-f718-4579-9eac-0875e4b50b09","Type":"ContainerDied","Data":"4f9d6ae29fdd9ceebe5ba93ad6e4eee82ceec0523dd728974d3a607a95c9a060"} Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.831256 4777 scope.go:117] "RemoveContainer" containerID="9614c62ac2cfa4ded35e869e9662e2cacf130dc0379c7989dfaf39b7778a7982" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.878010 4777 scope.go:117] "RemoveContainer" containerID="1c51640e3f8f958de676dd5a3410ace1d2d68442dc4b5279f7a4c1b59691a63a" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.895000 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.908901 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.935659 4777 scope.go:117] "RemoveContainer" containerID="8d6e93faaa514c5344efede0ae010a928105766105501eda75397198200a312c" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.940385 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:01:03 crc kubenswrapper[4777]: E0216 22:01:03.940849 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="sg-core" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.940864 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="sg-core" Feb 16 22:01:03 crc kubenswrapper[4777]: E0216 22:01:03.940886 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="proxy-httpd" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.940892 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="proxy-httpd" Feb 16 22:01:03 crc kubenswrapper[4777]: E0216 22:01:03.940915 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="ceilometer-notification-agent" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.940921 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="ceilometer-notification-agent" Feb 16 22:01:03 crc kubenswrapper[4777]: E0216 22:01:03.940933 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="ceilometer-central-agent" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.940939 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="ceilometer-central-agent" Feb 16 22:01:03 crc 
kubenswrapper[4777]: I0216 22:01:03.941123 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="ceilometer-notification-agent" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.941138 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="sg-core" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.941150 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="ceilometer-central-agent" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.941337 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" containerName="proxy-httpd" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.943340 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.945695 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.949266 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.949421 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.950009 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:01:03 crc kubenswrapper[4777]: I0216 22:01:03.974157 4777 scope.go:117] "RemoveContainer" containerID="97bfa750fd87b00358112c7008ccf5dea33a34e8fd107edeffbb7d248458a4b8" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.062105 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.062181 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbefa3e5-4388-4b9e-9c93-6580727d021d-run-httpd\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.062273 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbefa3e5-4388-4b9e-9c93-6580727d021d-log-httpd\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.062422 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-config-data\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.062473 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.062656 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-scripts\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " 
pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.062688 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.062795 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb8sh\" (UniqueName: \"kubernetes.io/projected/fbefa3e5-4388-4b9e-9c93-6580727d021d-kube-api-access-rb8sh\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.164361 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbefa3e5-4388-4b9e-9c93-6580727d021d-log-httpd\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.164501 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-config-data\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.164543 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.164609 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-scripts\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.164629 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.164672 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb8sh\" (UniqueName: \"kubernetes.io/projected/fbefa3e5-4388-4b9e-9c93-6580727d021d-kube-api-access-rb8sh\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.164706 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.164807 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbefa3e5-4388-4b9e-9c93-6580727d021d-run-httpd\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.164916 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbefa3e5-4388-4b9e-9c93-6580727d021d-log-httpd\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 
crc kubenswrapper[4777]: I0216 22:01:04.165316 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbefa3e5-4388-4b9e-9c93-6580727d021d-run-httpd\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.169768 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.171502 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-scripts\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.171786 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.179191 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.180533 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbefa3e5-4388-4b9e-9c93-6580727d021d-config-data\") pod \"ceilometer-0\" (UID: 
\"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.184545 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb8sh\" (UniqueName: \"kubernetes.io/projected/fbefa3e5-4388-4b9e-9c93-6580727d021d-kube-api-access-rb8sh\") pod \"ceilometer-0\" (UID: \"fbefa3e5-4388-4b9e-9c93-6580727d021d\") " pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.200794 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9972ceba-f718-4579-9eac-0875e4b50b09" path="/var/lib/kubelet/pods/9972ceba-f718-4579-9eac-0875e4b50b09/volumes" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.274594 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.813247 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 16 22:01:04 crc kubenswrapper[4777]: I0216 22:01:04.851816 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbefa3e5-4388-4b9e-9c93-6580727d021d","Type":"ContainerStarted","Data":"76b8d5ac1db70b729d96c7c25b1ed0fa42176f1348419853f9934122bd830716"} Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.318495 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.389873 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.394064 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nslz\" (UniqueName: \"kubernetes.io/projected/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-kube-api-access-8nslz\") pod \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.394173 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-fernet-keys\") pod \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.394253 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-config-data\") pod \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.394282 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-combined-ca-bundle\") pod \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\" (UID: \"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa\") " Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.399948 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-kube-api-access-8nslz" (OuterVolumeSpecName: "kube-api-access-8nslz") pod "f98b7fd0-6b82-4bba-ac85-6ee9afa204aa" (UID: "f98b7fd0-6b82-4bba-ac85-6ee9afa204aa"). InnerVolumeSpecName "kube-api-access-8nslz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.402764 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f98b7fd0-6b82-4bba-ac85-6ee9afa204aa" (UID: "f98b7fd0-6b82-4bba-ac85-6ee9afa204aa"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.425571 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f98b7fd0-6b82-4bba-ac85-6ee9afa204aa" (UID: "f98b7fd0-6b82-4bba-ac85-6ee9afa204aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.481985 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-config-data" (OuterVolumeSpecName: "config-data") pod "f98b7fd0-6b82-4bba-ac85-6ee9afa204aa" (UID: "f98b7fd0-6b82-4bba-ac85-6ee9afa204aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.496489 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-combined-ca-bundle\") pod \"05b6aced-df73-44a5-9338-553f28ad707b\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.496566 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05b6aced-df73-44a5-9338-553f28ad707b-logs\") pod \"05b6aced-df73-44a5-9338-553f28ad707b\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.496603 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-config-data\") pod \"05b6aced-df73-44a5-9338-553f28ad707b\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.496644 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhnq6\" (UniqueName: \"kubernetes.io/projected/05b6aced-df73-44a5-9338-553f28ad707b-kube-api-access-bhnq6\") pod \"05b6aced-df73-44a5-9338-553f28ad707b\" (UID: \"05b6aced-df73-44a5-9338-553f28ad707b\") " Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.496938 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b6aced-df73-44a5-9338-553f28ad707b-logs" (OuterVolumeSpecName: "logs") pod "05b6aced-df73-44a5-9338-553f28ad707b" (UID: "05b6aced-df73-44a5-9338-553f28ad707b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.497344 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.497363 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05b6aced-df73-44a5-9338-553f28ad707b-logs\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.497374 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nslz\" (UniqueName: \"kubernetes.io/projected/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-kube-api-access-8nslz\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.497385 4777 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.497393 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f98b7fd0-6b82-4bba-ac85-6ee9afa204aa-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.500147 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b6aced-df73-44a5-9338-553f28ad707b-kube-api-access-bhnq6" (OuterVolumeSpecName: "kube-api-access-bhnq6") pod "05b6aced-df73-44a5-9338-553f28ad707b" (UID: "05b6aced-df73-44a5-9338-553f28ad707b"). InnerVolumeSpecName "kube-api-access-bhnq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.533199 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-config-data" (OuterVolumeSpecName: "config-data") pod "05b6aced-df73-44a5-9338-553f28ad707b" (UID: "05b6aced-df73-44a5-9338-553f28ad707b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.536060 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05b6aced-df73-44a5-9338-553f28ad707b" (UID: "05b6aced-df73-44a5-9338-553f28ad707b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.599032 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.599310 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhnq6\" (UniqueName: \"kubernetes.io/projected/05b6aced-df73-44a5-9338-553f28ad707b-kube-api-access-bhnq6\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.599411 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05b6aced-df73-44a5-9338-553f28ad707b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.865817 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29521321-ljtlr" 
event={"ID":"f98b7fd0-6b82-4bba-ac85-6ee9afa204aa","Type":"ContainerDied","Data":"f2a740687a6bf837121dd636aa96df0bcf5d2ebd5484f7727524e49780b0a30b"} Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.866071 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2a740687a6bf837121dd636aa96df0bcf5d2ebd5484f7727524e49780b0a30b" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.865834 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29521321-ljtlr" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.867370 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbefa3e5-4388-4b9e-9c93-6580727d021d","Type":"ContainerStarted","Data":"e04d4a17e92b64fb0d6b788b830a59fe05ffb18ff1456b080301c3c90ba58342"} Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.869140 4777 generic.go:334] "Generic (PLEG): container finished" podID="05b6aced-df73-44a5-9338-553f28ad707b" containerID="9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4" exitCode=0 Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.869177 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05b6aced-df73-44a5-9338-553f28ad707b","Type":"ContainerDied","Data":"9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4"} Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.869199 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05b6aced-df73-44a5-9338-553f28ad707b","Type":"ContainerDied","Data":"1a2fe4be23b3882f8ecbcb52329409378a3da9909c898af3abc2fc08dae53941"} Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.869200 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.869219 4777 scope.go:117] "RemoveContainer" containerID="9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.931571 4777 scope.go:117] "RemoveContainer" containerID="b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.967960 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.975891 4777 scope.go:117] "RemoveContainer" containerID="9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4" Feb 16 22:01:05 crc kubenswrapper[4777]: E0216 22:01:05.976395 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4\": container with ID starting with 9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4 not found: ID does not exist" containerID="9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.976447 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4"} err="failed to get container status \"9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4\": rpc error: code = NotFound desc = could not find container \"9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4\": container with ID starting with 9e74b4dca48d4f0f5617c9d313a694881884d2abd7766de70a7fd90025c161b4 not found: ID does not exist" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.976478 4777 scope.go:117] "RemoveContainer" containerID="b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d" Feb 16 22:01:05 crc 
kubenswrapper[4777]: E0216 22:01:05.976839 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d\": container with ID starting with b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d not found: ID does not exist" containerID="b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.976883 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d"} err="failed to get container status \"b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d\": rpc error: code = NotFound desc = could not find container \"b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d\": container with ID starting with b51c5fddd4cffe398acef5acc2749c1a44d10480e97b01357e5f3ae0b5a0142d not found: ID does not exist" Feb 16 22:01:05 crc kubenswrapper[4777]: I0216 22:01:05.996889 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.021773 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 16 22:01:06 crc kubenswrapper[4777]: E0216 22:01:06.022242 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f98b7fd0-6b82-4bba-ac85-6ee9afa204aa" containerName="keystone-cron" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.022259 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="f98b7fd0-6b82-4bba-ac85-6ee9afa204aa" containerName="keystone-cron" Feb 16 22:01:06 crc kubenswrapper[4777]: E0216 22:01:06.022272 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b6aced-df73-44a5-9338-553f28ad707b" containerName="nova-api-api" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.022279 4777 
state_mem.go:107] "Deleted CPUSet assignment" podUID="05b6aced-df73-44a5-9338-553f28ad707b" containerName="nova-api-api" Feb 16 22:01:06 crc kubenswrapper[4777]: E0216 22:01:06.022304 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b6aced-df73-44a5-9338-553f28ad707b" containerName="nova-api-log" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.022310 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b6aced-df73-44a5-9338-553f28ad707b" containerName="nova-api-log" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.022495 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="05b6aced-df73-44a5-9338-553f28ad707b" containerName="nova-api-log" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.022511 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="05b6aced-df73-44a5-9338-553f28ad707b" containerName="nova-api-api" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.022524 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="f98b7fd0-6b82-4bba-ac85-6ee9afa204aa" containerName="keystone-cron" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.023636 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.028093 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.028539 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.035053 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.037053 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.109877 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgxjm\" (UniqueName: \"kubernetes.io/projected/55c757f0-8b59-4616-9ca4-077981583104-kube-api-access-fgxjm\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.109971 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-public-tls-certs\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.109991 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55c757f0-8b59-4616-9ca4-077981583104-logs\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.110014 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-config-data\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.110056 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.110095 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-internal-tls-certs\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.194979 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b6aced-df73-44a5-9338-553f28ad707b" path="/var/lib/kubelet/pods/05b6aced-df73-44a5-9338-553f28ad707b/volumes" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.211504 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgxjm\" (UniqueName: \"kubernetes.io/projected/55c757f0-8b59-4616-9ca4-077981583104-kube-api-access-fgxjm\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.211602 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-public-tls-certs\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.211620 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55c757f0-8b59-4616-9ca4-077981583104-logs\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.211641 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-config-data\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.211680 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.211733 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-internal-tls-certs\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.212098 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55c757f0-8b59-4616-9ca4-077981583104-logs\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.217338 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-internal-tls-certs\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0" Feb 16 22:01:06 crc 
kubenswrapper[4777]: I0216 22:01:06.220159 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-config-data\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0"
Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.222199 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-public-tls-certs\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0"
Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.224206 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0"
Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.228444 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgxjm\" (UniqueName: \"kubernetes.io/projected/55c757f0-8b59-4616-9ca4-077981583104-kube-api-access-fgxjm\") pod \"nova-api-0\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " pod="openstack/nova-api-0"
Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.356915 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 16 22:01:06 crc kubenswrapper[4777]: W0216 22:01:06.856873 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55c757f0_8b59_4616_9ca4_077981583104.slice/crio-0d331a6f42728143ee265364e17105a276ffdd2ebf5156626d67e542d5c79f96 WatchSource:0}: Error finding container 0d331a6f42728143ee265364e17105a276ffdd2ebf5156626d67e542d5c79f96: Status 404 returned error can't find the container with id 0d331a6f42728143ee265364e17105a276ffdd2ebf5156626d67e542d5c79f96
Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.859053 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.904179 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbefa3e5-4388-4b9e-9c93-6580727d021d","Type":"ContainerStarted","Data":"2bc6b95f725b02a53a6966d11ab213db6d7fb6079392efcb68d173a1e8173562"}
Feb 16 22:01:06 crc kubenswrapper[4777]: I0216 22:01:06.906543 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55c757f0-8b59-4616-9ca4-077981583104","Type":"ContainerStarted","Data":"0d331a6f42728143ee265364e17105a276ffdd2ebf5156626d67e542d5c79f96"}
Feb 16 22:01:07 crc kubenswrapper[4777]: I0216 22:01:07.087305 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:01:07 crc kubenswrapper[4777]: I0216 22:01:07.111004 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 16 22:01:07 crc kubenswrapper[4777]: I0216 22:01:07.111056 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 16 22:01:07 crc kubenswrapper[4777]: I0216 22:01:07.117157 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:01:07 crc kubenswrapper[4777]: I0216 22:01:07.926612 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55c757f0-8b59-4616-9ca4-077981583104","Type":"ContainerStarted","Data":"8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040"}
Feb 16 22:01:07 crc kubenswrapper[4777]: I0216 22:01:07.927165 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55c757f0-8b59-4616-9ca4-077981583104","Type":"ContainerStarted","Data":"a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726"}
Feb 16 22:01:07 crc kubenswrapper[4777]: I0216 22:01:07.934737 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbefa3e5-4388-4b9e-9c93-6580727d021d","Type":"ContainerStarted","Data":"daf560d5106d9124d51a4772e8968f386b83b0e46f065fa71344552e881d6b6c"}
Feb 16 22:01:07 crc kubenswrapper[4777]: I0216 22:01:07.957226 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.957203612 podStartE2EDuration="2.957203612s" podCreationTimestamp="2026-02-16 22:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:01:07.951748309 +0000 UTC m=+1388.534249441" watchObservedRunningTime="2026-02-16 22:01:07.957203612 +0000 UTC m=+1388.539704734"
Feb 16 22:01:07 crc kubenswrapper[4777]: I0216 22:01:07.970457 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.122938 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.218:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.122946 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.218:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.147285 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-l98fl"]
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.149064 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.157105 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.157304 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.157926 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-l98fl"]
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.261556 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.261668 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-config-data\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.261754 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-scripts\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.261792 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xblzh\" (UniqueName: \"kubernetes.io/projected/ee894943-d453-4455-be80-81a3c20ad9de-kube-api-access-xblzh\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.363807 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-scripts\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.363862 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xblzh\" (UniqueName: \"kubernetes.io/projected/ee894943-d453-4455-be80-81a3c20ad9de-kube-api-access-xblzh\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.363922 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.364019 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-config-data\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.370238 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-config-data\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.375191 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-scripts\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.376587 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.382466 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xblzh\" (UniqueName: \"kubernetes.io/projected/ee894943-d453-4455-be80-81a3c20ad9de-kube-api-access-xblzh\") pod \"nova-cell1-cell-mapping-l98fl\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") " pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:08 crc kubenswrapper[4777]: I0216 22:01:08.470648 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.026562 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-l98fl"]
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.330540 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-qsg2h"
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.398250 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-llbkr"]
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.398539 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-llbkr" podUID="37414fb0-84e5-436f-9e5e-d6bdaba54c8c" containerName="dnsmasq-dns" containerID="cri-o://80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7" gracePeriod=10
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.941440 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-llbkr"
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.955415 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbefa3e5-4388-4b9e-9c93-6580727d021d","Type":"ContainerStarted","Data":"abf316980bd9e3e3af47433524226852de063c504c5cf3cc8e891b714431446e"}
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.955555 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.958047 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l98fl" event={"ID":"ee894943-d453-4455-be80-81a3c20ad9de","Type":"ContainerStarted","Data":"c0bca3a547bb50a2025f4e724de9e912dc95e5a230fb99abfaf552efaa427df9"}
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.958090 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l98fl" event={"ID":"ee894943-d453-4455-be80-81a3c20ad9de","Type":"ContainerStarted","Data":"1d60ea1e3c0756902e3e59da5cc1000c9f9316ca5576c676d883b28e723a082e"}
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.960569 4777 generic.go:334] "Generic (PLEG): container finished" podID="37414fb0-84e5-436f-9e5e-d6bdaba54c8c" containerID="80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7" exitCode=0
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.960606 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-llbkr" event={"ID":"37414fb0-84e5-436f-9e5e-d6bdaba54c8c","Type":"ContainerDied","Data":"80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7"}
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.960646 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-llbkr" event={"ID":"37414fb0-84e5-436f-9e5e-d6bdaba54c8c","Type":"ContainerDied","Data":"75cda06423bb8b5482f1ad1fb2fc57cbf606d90b07e0b014f990fd0fb2f98995"}
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.960663 4777 scope.go:117] "RemoveContainer" containerID="80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7"
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.960811 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-llbkr"
Feb 16 22:01:09 crc kubenswrapper[4777]: I0216 22:01:09.999234 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-l98fl" podStartSLOduration=1.9992127819999999 podStartE2EDuration="1.999212782s" podCreationTimestamp="2026-02-16 22:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:01:09.991925588 +0000 UTC m=+1390.574426690" watchObservedRunningTime="2026-02-16 22:01:09.999212782 +0000 UTC m=+1390.581713874"
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.000486 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-svc\") pod \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") "
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.000555 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-sb\") pod \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") "
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.000596 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-config\") pod \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") "
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.000774 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-swift-storage-0\") pod \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") "
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.000799 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8fl2\" (UniqueName: \"kubernetes.io/projected/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-kube-api-access-d8fl2\") pod \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") "
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.000848 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-nb\") pod \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\" (UID: \"37414fb0-84e5-436f-9e5e-d6bdaba54c8c\") "
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.001001 4777 scope.go:117] "RemoveContainer" containerID="37e387ab7dc796fb6fa890ebd00b063e8bde9687a8067fe68027e30d7c3fe612"
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.010137 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-kube-api-access-d8fl2" (OuterVolumeSpecName: "kube-api-access-d8fl2") pod "37414fb0-84e5-436f-9e5e-d6bdaba54c8c" (UID: "37414fb0-84e5-436f-9e5e-d6bdaba54c8c"). InnerVolumeSpecName "kube-api-access-d8fl2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.037176 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.018934096 podStartE2EDuration="7.037156225s" podCreationTimestamp="2026-02-16 22:01:03 +0000 UTC" firstStartedPulling="2026-02-16 22:01:04.804114453 +0000 UTC m=+1385.386615595" lastFinishedPulling="2026-02-16 22:01:08.822336622 +0000 UTC m=+1389.404837724" observedRunningTime="2026-02-16 22:01:10.03235427 +0000 UTC m=+1390.614855372" watchObservedRunningTime="2026-02-16 22:01:10.037156225 +0000 UTC m=+1390.619657327"
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.071323 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "37414fb0-84e5-436f-9e5e-d6bdaba54c8c" (UID: "37414fb0-84e5-436f-9e5e-d6bdaba54c8c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.071785 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-config" (OuterVolumeSpecName: "config") pod "37414fb0-84e5-436f-9e5e-d6bdaba54c8c" (UID: "37414fb0-84e5-436f-9e5e-d6bdaba54c8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.101733 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37414fb0-84e5-436f-9e5e-d6bdaba54c8c" (UID: "37414fb0-84e5-436f-9e5e-d6bdaba54c8c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.104056 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.104089 4777 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-config\") on node \"crc\" DevicePath \"\""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.104102 4777 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.104115 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8fl2\" (UniqueName: \"kubernetes.io/projected/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-kube-api-access-d8fl2\") on node \"crc\" DevicePath \"\""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.118324 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37414fb0-84e5-436f-9e5e-d6bdaba54c8c" (UID: "37414fb0-84e5-436f-9e5e-d6bdaba54c8c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.132262 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37414fb0-84e5-436f-9e5e-d6bdaba54c8c" (UID: "37414fb0-84e5-436f-9e5e-d6bdaba54c8c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.207608 4777 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.207641 4777 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37414fb0-84e5-436f-9e5e-d6bdaba54c8c-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.220656 4777 scope.go:117] "RemoveContainer" containerID="80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7"
Feb 16 22:01:10 crc kubenswrapper[4777]: E0216 22:01:10.221149 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7\": container with ID starting with 80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7 not found: ID does not exist" containerID="80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7"
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.221216 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7"} err="failed to get container status \"80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7\": rpc error: code = NotFound desc = could not find container \"80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7\": container with ID starting with 80ad145ab3f0f643deb70c14e39efd5c24ac08ffa0de8dff891c178ae472a3f7 not found: ID does not exist"
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.221247 4777 scope.go:117] "RemoveContainer" containerID="37e387ab7dc796fb6fa890ebd00b063e8bde9687a8067fe68027e30d7c3fe612"
Feb 16 22:01:10 crc kubenswrapper[4777]: E0216 22:01:10.221796 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e387ab7dc796fb6fa890ebd00b063e8bde9687a8067fe68027e30d7c3fe612\": container with ID starting with 37e387ab7dc796fb6fa890ebd00b063e8bde9687a8067fe68027e30d7c3fe612 not found: ID does not exist" containerID="37e387ab7dc796fb6fa890ebd00b063e8bde9687a8067fe68027e30d7c3fe612"
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.221830 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e387ab7dc796fb6fa890ebd00b063e8bde9687a8067fe68027e30d7c3fe612"} err="failed to get container status \"37e387ab7dc796fb6fa890ebd00b063e8bde9687a8067fe68027e30d7c3fe612\": rpc error: code = NotFound desc = could not find container \"37e387ab7dc796fb6fa890ebd00b063e8bde9687a8067fe68027e30d7c3fe612\": container with ID starting with 37e387ab7dc796fb6fa890ebd00b063e8bde9687a8067fe68027e30d7c3fe612 not found: ID does not exist"
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.292883 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-llbkr"]
Feb 16 22:01:10 crc kubenswrapper[4777]: I0216 22:01:10.304432 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-llbkr"]
Feb 16 22:01:12 crc kubenswrapper[4777]: I0216 22:01:12.193675 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37414fb0-84e5-436f-9e5e-d6bdaba54c8c" path="/var/lib/kubelet/pods/37414fb0-84e5-436f-9e5e-d6bdaba54c8c/volumes"
Feb 16 22:01:13 crc kubenswrapper[4777]: E0216 22:01:13.183706 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:01:14 crc kubenswrapper[4777]: I0216 22:01:14.021199 4777 generic.go:334] "Generic (PLEG): container finished" podID="ee894943-d453-4455-be80-81a3c20ad9de" containerID="c0bca3a547bb50a2025f4e724de9e912dc95e5a230fb99abfaf552efaa427df9" exitCode=0
Feb 16 22:01:14 crc kubenswrapper[4777]: I0216 22:01:14.021248 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l98fl" event={"ID":"ee894943-d453-4455-be80-81a3c20ad9de","Type":"ContainerDied","Data":"c0bca3a547bb50a2025f4e724de9e912dc95e5a230fb99abfaf552efaa427df9"}
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.447431 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.517703 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-config-data\") pod \"ee894943-d453-4455-be80-81a3c20ad9de\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") "
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.518082 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xblzh\" (UniqueName: \"kubernetes.io/projected/ee894943-d453-4455-be80-81a3c20ad9de-kube-api-access-xblzh\") pod \"ee894943-d453-4455-be80-81a3c20ad9de\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") "
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.518176 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-scripts\") pod \"ee894943-d453-4455-be80-81a3c20ad9de\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") "
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.518250 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-combined-ca-bundle\") pod \"ee894943-d453-4455-be80-81a3c20ad9de\" (UID: \"ee894943-d453-4455-be80-81a3c20ad9de\") "
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.526513 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-scripts" (OuterVolumeSpecName: "scripts") pod "ee894943-d453-4455-be80-81a3c20ad9de" (UID: "ee894943-d453-4455-be80-81a3c20ad9de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.532891 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee894943-d453-4455-be80-81a3c20ad9de-kube-api-access-xblzh" (OuterVolumeSpecName: "kube-api-access-xblzh") pod "ee894943-d453-4455-be80-81a3c20ad9de" (UID: "ee894943-d453-4455-be80-81a3c20ad9de"). InnerVolumeSpecName "kube-api-access-xblzh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.553609 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-config-data" (OuterVolumeSpecName: "config-data") pod "ee894943-d453-4455-be80-81a3c20ad9de" (UID: "ee894943-d453-4455-be80-81a3c20ad9de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.562005 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee894943-d453-4455-be80-81a3c20ad9de" (UID: "ee894943-d453-4455-be80-81a3c20ad9de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.620513 4777 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-scripts\") on node \"crc\" DevicePath \"\""
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.620543 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.620555 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee894943-d453-4455-be80-81a3c20ad9de-config-data\") on node \"crc\" DevicePath \"\""
Feb 16 22:01:15 crc kubenswrapper[4777]: I0216 22:01:15.620564 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xblzh\" (UniqueName: \"kubernetes.io/projected/ee894943-d453-4455-be80-81a3c20ad9de-kube-api-access-xblzh\") on node \"crc\" DevicePath \"\""
Feb 16 22:01:16 crc kubenswrapper[4777]: I0216 22:01:16.077615 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l98fl" event={"ID":"ee894943-d453-4455-be80-81a3c20ad9de","Type":"ContainerDied","Data":"1d60ea1e3c0756902e3e59da5cc1000c9f9316ca5576c676d883b28e723a082e"}
Feb 16 22:01:16 crc kubenswrapper[4777]: I0216 22:01:16.077660 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d60ea1e3c0756902e3e59da5cc1000c9f9316ca5576c676d883b28e723a082e"
Feb 16 22:01:16 crc kubenswrapper[4777]: I0216 22:01:16.077825 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l98fl"
Feb 16 22:01:16 crc kubenswrapper[4777]: I0216 22:01:16.250968 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 16 22:01:16 crc kubenswrapper[4777]: I0216 22:01:16.251187 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="55c757f0-8b59-4616-9ca4-077981583104" containerName="nova-api-log" containerID="cri-o://a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726" gracePeriod=30
Feb 16 22:01:16 crc kubenswrapper[4777]: I0216 22:01:16.251294 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="55c757f0-8b59-4616-9ca4-077981583104" containerName="nova-api-api" containerID="cri-o://8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040" gracePeriod=30
Feb 16 22:01:16 crc kubenswrapper[4777]: I0216 22:01:16.277405 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 16 22:01:16 crc kubenswrapper[4777]: I0216 22:01:16.277691 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1602cea5-c845-470a-812e-6c1f22e201be" containerName="nova-scheduler-scheduler" containerID="cri-o://47282fb44dcc1039dadf7c3d08f3d6d5c5632e82239bf7ab7a57ea2267ab6439" gracePeriod=30
Feb 16 22:01:16 crc kubenswrapper[4777]: I0216 22:01:16.299236 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 16 22:01:16 crc kubenswrapper[4777]: I0216 22:01:16.299803 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerName="nova-metadata-log" containerID="cri-o://17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d" gracePeriod=30
Feb 16 22:01:16 crc kubenswrapper[4777]: I0216 22:01:16.300065 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerName="nova-metadata-metadata" containerID="cri-o://0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832" gracePeriod=30
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.054749 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.097455 4777 generic.go:334] "Generic (PLEG): container finished" podID="55c757f0-8b59-4616-9ca4-077981583104" containerID="8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040" exitCode=0
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.097485 4777 generic.go:334] "Generic (PLEG): container finished" podID="55c757f0-8b59-4616-9ca4-077981583104" containerID="a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726" exitCode=143
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.097533 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55c757f0-8b59-4616-9ca4-077981583104","Type":"ContainerDied","Data":"8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040"}
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.097560 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55c757f0-8b59-4616-9ca4-077981583104","Type":"ContainerDied","Data":"a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726"}
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.097570 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55c757f0-8b59-4616-9ca4-077981583104","Type":"ContainerDied","Data":"0d331a6f42728143ee265364e17105a276ffdd2ebf5156626d67e542d5c79f96"}
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.097535 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.097584 4777 scope.go:117] "RemoveContainer" containerID="8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040"
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.100940 4777 generic.go:334] "Generic (PLEG): container finished" podID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerID="17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d" exitCode=143
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.101014 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d","Type":"ContainerDied","Data":"17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d"}
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.102333 4777 generic.go:334] "Generic (PLEG): container finished" podID="1602cea5-c845-470a-812e-6c1f22e201be" containerID="47282fb44dcc1039dadf7c3d08f3d6d5c5632e82239bf7ab7a57ea2267ab6439" exitCode=0
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.102370 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1602cea5-c845-470a-812e-6c1f22e201be","Type":"ContainerDied","Data":"47282fb44dcc1039dadf7c3d08f3d6d5c5632e82239bf7ab7a57ea2267ab6439"}
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.122858 4777 scope.go:117] "RemoveContainer" containerID="a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726"
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.143512 4777 scope.go:117] "RemoveContainer" containerID="8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040"
Feb 16 22:01:17 crc kubenswrapper[4777]: E0216 22:01:17.144207 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040\": container with ID starting with 8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040 not found: ID does not exist" containerID="8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040"
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.144267 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040"} err="failed to get container status \"8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040\": rpc error: code = NotFound desc = could not find container \"8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040\": container with ID starting with 8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040 not found: ID does not exist"
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.144296 4777 scope.go:117] "RemoveContainer" containerID="a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726"
Feb 16 22:01:17 crc kubenswrapper[4777]: E0216 22:01:17.144549 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726\": container with ID starting with a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726 not found: ID does not exist" containerID="a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726"
Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.144577 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726"} err="failed to get container status \"a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726\": rpc error: code = NotFound desc = could not find container \"a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726\": container with ID starting with a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726 not found: ID does not
exist" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.144606 4777 scope.go:117] "RemoveContainer" containerID="8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.144820 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040"} err="failed to get container status \"8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040\": rpc error: code = NotFound desc = could not find container \"8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040\": container with ID starting with 8536856660ae10038bbde792eabe888e3f336b5f79b81bb569d8898bbd830040 not found: ID does not exist" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.144841 4777 scope.go:117] "RemoveContainer" containerID="a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.145041 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726"} err="failed to get container status \"a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726\": rpc error: code = NotFound desc = could not find container \"a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726\": container with ID starting with a471fca4c2abca809b917486303c7a2df38cdaa006611552d3bd1b522c40a726 not found: ID does not exist" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.156338 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgxjm\" (UniqueName: \"kubernetes.io/projected/55c757f0-8b59-4616-9ca4-077981583104-kube-api-access-fgxjm\") pod \"55c757f0-8b59-4616-9ca4-077981583104\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.156458 4777 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-public-tls-certs\") pod \"55c757f0-8b59-4616-9ca4-077981583104\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.156575 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-internal-tls-certs\") pod \"55c757f0-8b59-4616-9ca4-077981583104\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.156661 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55c757f0-8b59-4616-9ca4-077981583104-logs\") pod \"55c757f0-8b59-4616-9ca4-077981583104\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.156682 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-config-data\") pod \"55c757f0-8b59-4616-9ca4-077981583104\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.156734 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-combined-ca-bundle\") pod \"55c757f0-8b59-4616-9ca4-077981583104\" (UID: \"55c757f0-8b59-4616-9ca4-077981583104\") " Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.158444 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c757f0-8b59-4616-9ca4-077981583104-logs" (OuterVolumeSpecName: "logs") pod "55c757f0-8b59-4616-9ca4-077981583104" (UID: 
"55c757f0-8b59-4616-9ca4-077981583104"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.162211 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c757f0-8b59-4616-9ca4-077981583104-kube-api-access-fgxjm" (OuterVolumeSpecName: "kube-api-access-fgxjm") pod "55c757f0-8b59-4616-9ca4-077981583104" (UID: "55c757f0-8b59-4616-9ca4-077981583104"). InnerVolumeSpecName "kube-api-access-fgxjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.193827 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.199487 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-config-data" (OuterVolumeSpecName: "config-data") pod "55c757f0-8b59-4616-9ca4-077981583104" (UID: "55c757f0-8b59-4616-9ca4-077981583104"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.203506 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55c757f0-8b59-4616-9ca4-077981583104" (UID: "55c757f0-8b59-4616-9ca4-077981583104"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.252844 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "55c757f0-8b59-4616-9ca4-077981583104" (UID: "55c757f0-8b59-4616-9ca4-077981583104"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.253453 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "55c757f0-8b59-4616-9ca4-077981583104" (UID: "55c757f0-8b59-4616-9ca4-077981583104"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.259452 4777 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.259482 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55c757f0-8b59-4616-9ca4-077981583104-logs\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.259492 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.259500 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.259509 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgxjm\" (UniqueName: \"kubernetes.io/projected/55c757f0-8b59-4616-9ca4-077981583104-kube-api-access-fgxjm\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.259519 4777 reconciler_common.go:293] "Volume detached for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c757f0-8b59-4616-9ca4-077981583104-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.371826 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-combined-ca-bundle\") pod \"1602cea5-c845-470a-812e-6c1f22e201be\" (UID: \"1602cea5-c845-470a-812e-6c1f22e201be\") " Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.372000 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvzjh\" (UniqueName: \"kubernetes.io/projected/1602cea5-c845-470a-812e-6c1f22e201be-kube-api-access-qvzjh\") pod \"1602cea5-c845-470a-812e-6c1f22e201be\" (UID: \"1602cea5-c845-470a-812e-6c1f22e201be\") " Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.372052 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-config-data\") pod \"1602cea5-c845-470a-812e-6c1f22e201be\" (UID: \"1602cea5-c845-470a-812e-6c1f22e201be\") " Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.380257 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1602cea5-c845-470a-812e-6c1f22e201be-kube-api-access-qvzjh" (OuterVolumeSpecName: "kube-api-access-qvzjh") pod "1602cea5-c845-470a-812e-6c1f22e201be" (UID: "1602cea5-c845-470a-812e-6c1f22e201be"). InnerVolumeSpecName "kube-api-access-qvzjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.399005 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1602cea5-c845-470a-812e-6c1f22e201be" (UID: "1602cea5-c845-470a-812e-6c1f22e201be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.399665 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-config-data" (OuterVolumeSpecName: "config-data") pod "1602cea5-c845-470a-812e-6c1f22e201be" (UID: "1602cea5-c845-470a-812e-6c1f22e201be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.474304 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.474344 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvzjh\" (UniqueName: \"kubernetes.io/projected/1602cea5-c845-470a-812e-6c1f22e201be-kube-api-access-qvzjh\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.474361 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1602cea5-c845-470a-812e-6c1f22e201be-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.493254 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.504568 4777 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-0"] Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.517920 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 16 22:01:17 crc kubenswrapper[4777]: E0216 22:01:17.518383 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c757f0-8b59-4616-9ca4-077981583104" containerName="nova-api-api" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.518402 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c757f0-8b59-4616-9ca4-077981583104" containerName="nova-api-api" Feb 16 22:01:17 crc kubenswrapper[4777]: E0216 22:01:17.518419 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee894943-d453-4455-be80-81a3c20ad9de" containerName="nova-manage" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.518426 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee894943-d453-4455-be80-81a3c20ad9de" containerName="nova-manage" Feb 16 22:01:17 crc kubenswrapper[4777]: E0216 22:01:17.518440 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1602cea5-c845-470a-812e-6c1f22e201be" containerName="nova-scheduler-scheduler" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.518446 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="1602cea5-c845-470a-812e-6c1f22e201be" containerName="nova-scheduler-scheduler" Feb 16 22:01:17 crc kubenswrapper[4777]: E0216 22:01:17.518456 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c757f0-8b59-4616-9ca4-077981583104" containerName="nova-api-log" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.518462 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c757f0-8b59-4616-9ca4-077981583104" containerName="nova-api-log" Feb 16 22:01:17 crc kubenswrapper[4777]: E0216 22:01:17.518473 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37414fb0-84e5-436f-9e5e-d6bdaba54c8c" containerName="dnsmasq-dns" Feb 16 22:01:17 crc 
kubenswrapper[4777]: I0216 22:01:17.518479 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="37414fb0-84e5-436f-9e5e-d6bdaba54c8c" containerName="dnsmasq-dns" Feb 16 22:01:17 crc kubenswrapper[4777]: E0216 22:01:17.518490 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37414fb0-84e5-436f-9e5e-d6bdaba54c8c" containerName="init" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.518496 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="37414fb0-84e5-436f-9e5e-d6bdaba54c8c" containerName="init" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.518699 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="1602cea5-c845-470a-812e-6c1f22e201be" containerName="nova-scheduler-scheduler" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.518724 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="37414fb0-84e5-436f-9e5e-d6bdaba54c8c" containerName="dnsmasq-dns" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.518733 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c757f0-8b59-4616-9ca4-077981583104" containerName="nova-api-api" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.518749 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee894943-d453-4455-be80-81a3c20ad9de" containerName="nova-manage" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.518760 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c757f0-8b59-4616-9ca4-077981583104" containerName="nova-api-log" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.519914 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.522301 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.522466 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.522763 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.533791 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.678175 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-config-data\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.678282 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-public-tls-certs\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.678418 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.678586 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.678887 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7qxd\" (UniqueName: \"kubernetes.io/projected/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-kube-api-access-f7qxd\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.679162 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-logs\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.780731 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-config-data\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.780815 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-public-tls-certs\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.780914 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 
22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.780983 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.781111 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7qxd\" (UniqueName: \"kubernetes.io/projected/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-kube-api-access-f7qxd\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.781266 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-logs\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.781626 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-logs\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.784097 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-public-tls-certs\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.785206 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-combined-ca-bundle\") pod \"nova-api-0\" 
(UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.787076 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-config-data\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.790853 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.813316 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7qxd\" (UniqueName: \"kubernetes.io/projected/bbc0152a-9610-4701-b7f9-e2ae9ddcf53a-kube-api-access-f7qxd\") pod \"nova-api-0\" (UID: \"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a\") " pod="openstack/nova-api-0" Feb 16 22:01:17 crc kubenswrapper[4777]: I0216 22:01:17.840066 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.115287 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1602cea5-c845-470a-812e-6c1f22e201be","Type":"ContainerDied","Data":"413bca43c2e561148a417b4e1df6cc56f6cafcb517bb35c5eb3195b0614e9d50"} Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.115595 4777 scope.go:117] "RemoveContainer" containerID="47282fb44dcc1039dadf7c3d08f3d6d5c5632e82239bf7ab7a57ea2267ab6439" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.115873 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.162480 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.177804 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.197408 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1602cea5-c845-470a-812e-6c1f22e201be" path="/var/lib/kubelet/pods/1602cea5-c845-470a-812e-6c1f22e201be/volumes" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.198119 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c757f0-8b59-4616-9ca4-077981583104" path="/var/lib/kubelet/pods/55c757f0-8b59-4616-9ca4-077981583104/volumes" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.200436 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.203264 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.205073 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.210659 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.294838 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52t5v\" (UniqueName: \"kubernetes.io/projected/265517fa-85cd-4110-b910-69090b69be97-kube-api-access-52t5v\") pod \"nova-scheduler-0\" (UID: \"265517fa-85cd-4110-b910-69090b69be97\") " pod="openstack/nova-scheduler-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.295045 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265517fa-85cd-4110-b910-69090b69be97-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"265517fa-85cd-4110-b910-69090b69be97\") " pod="openstack/nova-scheduler-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.295186 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265517fa-85cd-4110-b910-69090b69be97-config-data\") pod \"nova-scheduler-0\" (UID: \"265517fa-85cd-4110-b910-69090b69be97\") " pod="openstack/nova-scheduler-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.374257 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 16 22:01:18 crc kubenswrapper[4777]: W0216 22:01:18.384777 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbc0152a_9610_4701_b7f9_e2ae9ddcf53a.slice/crio-5bfa06995177134bf07626ab0ecc8a2702d75b2f66ee5b462a0c80576995d3bc 
WatchSource:0}: Error finding container 5bfa06995177134bf07626ab0ecc8a2702d75b2f66ee5b462a0c80576995d3bc: Status 404 returned error can't find the container with id 5bfa06995177134bf07626ab0ecc8a2702d75b2f66ee5b462a0c80576995d3bc Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.399140 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265517fa-85cd-4110-b910-69090b69be97-config-data\") pod \"nova-scheduler-0\" (UID: \"265517fa-85cd-4110-b910-69090b69be97\") " pod="openstack/nova-scheduler-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.399209 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52t5v\" (UniqueName: \"kubernetes.io/projected/265517fa-85cd-4110-b910-69090b69be97-kube-api-access-52t5v\") pod \"nova-scheduler-0\" (UID: \"265517fa-85cd-4110-b910-69090b69be97\") " pod="openstack/nova-scheduler-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.399309 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265517fa-85cd-4110-b910-69090b69be97-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"265517fa-85cd-4110-b910-69090b69be97\") " pod="openstack/nova-scheduler-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.404759 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265517fa-85cd-4110-b910-69090b69be97-config-data\") pod \"nova-scheduler-0\" (UID: \"265517fa-85cd-4110-b910-69090b69be97\") " pod="openstack/nova-scheduler-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.406068 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265517fa-85cd-4110-b910-69090b69be97-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"265517fa-85cd-4110-b910-69090b69be97\") " pod="openstack/nova-scheduler-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.419379 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52t5v\" (UniqueName: \"kubernetes.io/projected/265517fa-85cd-4110-b910-69090b69be97-kube-api-access-52t5v\") pod \"nova-scheduler-0\" (UID: \"265517fa-85cd-4110-b910-69090b69be97\") " pod="openstack/nova-scheduler-0" Feb 16 22:01:18 crc kubenswrapper[4777]: I0216 22:01:18.523873 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 16 22:01:19 crc kubenswrapper[4777]: I0216 22:01:19.035428 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 16 22:01:19 crc kubenswrapper[4777]: W0216 22:01:19.038222 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod265517fa_85cd_4110_b910_69090b69be97.slice/crio-cb1aac4265b1afa2c7e33dafe7f08f09396ba35c5a7c08345256a73514c1a240 WatchSource:0}: Error finding container cb1aac4265b1afa2c7e33dafe7f08f09396ba35c5a7c08345256a73514c1a240: Status 404 returned error can't find the container with id cb1aac4265b1afa2c7e33dafe7f08f09396ba35c5a7c08345256a73514c1a240 Feb 16 22:01:19 crc kubenswrapper[4777]: I0216 22:01:19.141457 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a","Type":"ContainerStarted","Data":"b8898cf0e5b79f0c26f2afbe7f5972be1a05140cd29d77b52f5d8bf99ffa5c53"} Feb 16 22:01:19 crc kubenswrapper[4777]: I0216 22:01:19.141519 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a","Type":"ContainerStarted","Data":"314901600a92fbbd0f051f715677369bd0031df091f66897f16d2db671a31902"} Feb 16 22:01:19 crc kubenswrapper[4777]: I0216 22:01:19.141540 4777 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bbc0152a-9610-4701-b7f9-e2ae9ddcf53a","Type":"ContainerStarted","Data":"5bfa06995177134bf07626ab0ecc8a2702d75b2f66ee5b462a0c80576995d3bc"} Feb 16 22:01:19 crc kubenswrapper[4777]: I0216 22:01:19.145417 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"265517fa-85cd-4110-b910-69090b69be97","Type":"ContainerStarted","Data":"cb1aac4265b1afa2c7e33dafe7f08f09396ba35c5a7c08345256a73514c1a240"} Feb 16 22:01:19 crc kubenswrapper[4777]: I0216 22:01:19.163381 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.163355724 podStartE2EDuration="2.163355724s" podCreationTimestamp="2026-02-16 22:01:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:01:19.161238665 +0000 UTC m=+1399.743739797" watchObservedRunningTime="2026-02-16 22:01:19.163355724 +0000 UTC m=+1399.745856866" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.031002 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.139109 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phvw7\" (UniqueName: \"kubernetes.io/projected/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-kube-api-access-phvw7\") pod \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.139866 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-logs\") pod \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.139938 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-combined-ca-bundle\") pod \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.140083 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-nova-metadata-tls-certs\") pod \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.140244 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-config-data\") pod \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\" (UID: \"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d\") " Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.140347 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-logs" (OuterVolumeSpecName: "logs") pod "0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" (UID: "0ed3d1bb-e1a8-4a70-a364-8f688b13f68d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.140984 4777 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-logs\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.156263 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-kube-api-access-phvw7" (OuterVolumeSpecName: "kube-api-access-phvw7") pod "0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" (UID: "0ed3d1bb-e1a8-4a70-a364-8f688b13f68d"). InnerVolumeSpecName "kube-api-access-phvw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.160208 4777 generic.go:334] "Generic (PLEG): container finished" podID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerID="0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832" exitCode=0 Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.160252 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d","Type":"ContainerDied","Data":"0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832"} Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.160278 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ed3d1bb-e1a8-4a70-a364-8f688b13f68d","Type":"ContainerDied","Data":"8b25a2352e191b32986c3893391ddd94f40620c6f8d7913d1aafbefe2b6f9ab8"} Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.160295 4777 scope.go:117] "RemoveContainer" 
containerID="0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.160379 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.163745 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"265517fa-85cd-4110-b910-69090b69be97","Type":"ContainerStarted","Data":"96d03bfcc902369467e09e9af16302e32e347ed8f15d4befbe6dacb683cb63ea"} Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.168818 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" (UID: "0ed3d1bb-e1a8-4a70-a364-8f688b13f68d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.214924 4777 scope.go:117] "RemoveContainer" containerID="17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.216869 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" (UID: "0ed3d1bb-e1a8-4a70-a364-8f688b13f68d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.224582 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-config-data" (OuterVolumeSpecName: "config-data") pod "0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" (UID: "0ed3d1bb-e1a8-4a70-a364-8f688b13f68d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.243063 4777 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.243090 4777 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-config-data\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.243102 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phvw7\" (UniqueName: \"kubernetes.io/projected/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-kube-api-access-phvw7\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.243111 4777 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.243300 4777 scope.go:117] "RemoveContainer" containerID="0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832" Feb 16 22:01:20 crc kubenswrapper[4777]: E0216 22:01:20.243739 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832\": container with ID starting with 0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832 not found: ID does not exist" containerID="0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.243785 4777 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832"} err="failed to get container status \"0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832\": rpc error: code = NotFound desc = could not find container \"0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832\": container with ID starting with 0307f1fecf4dd7c7a903b30b2848cf963879a554ea5b9ef81aa5d9696771d832 not found: ID does not exist" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.243812 4777 scope.go:117] "RemoveContainer" containerID="17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d" Feb 16 22:01:20 crc kubenswrapper[4777]: E0216 22:01:20.244173 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d\": container with ID starting with 17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d not found: ID does not exist" containerID="17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.244203 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d"} err="failed to get container status \"17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d\": rpc error: code = NotFound desc = could not find container \"17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d\": container with ID starting with 17f7562976354e6190824a8dad5e54b6271d23897b80f704727f7b3801c5d81d not found: ID does not exist" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.503646 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.503628971 podStartE2EDuration="2.503628971s" podCreationTimestamp="2026-02-16 22:01:18 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:01:20.194625687 +0000 UTC m=+1400.777126789" watchObservedRunningTime="2026-02-16 22:01:20.503628971 +0000 UTC m=+1401.086130073" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.505273 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.526782 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.535539 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 16 22:01:20 crc kubenswrapper[4777]: E0216 22:01:20.536212 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerName="nova-metadata-log" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.536243 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerName="nova-metadata-log" Feb 16 22:01:20 crc kubenswrapper[4777]: E0216 22:01:20.536298 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerName="nova-metadata-metadata" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.536311 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerName="nova-metadata-metadata" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.536656 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerName="nova-metadata-metadata" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.536687 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" containerName="nova-metadata-log" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 
22:01:20.538577 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.540745 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.541016 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.545961 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.650693 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brlj2\" (UniqueName: \"kubernetes.io/projected/376df6ac-d55a-46a9-9b11-893215a316a7-kube-api-access-brlj2\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.650741 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376df6ac-d55a-46a9-9b11-893215a316a7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.650822 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376df6ac-d55a-46a9-9b11-893215a316a7-logs\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.650870 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/376df6ac-d55a-46a9-9b11-893215a316a7-config-data\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.650886 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/376df6ac-d55a-46a9-9b11-893215a316a7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.752257 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376df6ac-d55a-46a9-9b11-893215a316a7-logs\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.752638 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376df6ac-d55a-46a9-9b11-893215a316a7-logs\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.752703 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/376df6ac-d55a-46a9-9b11-893215a316a7-config-data\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.752755 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/376df6ac-d55a-46a9-9b11-893215a316a7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 
22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.752891 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brlj2\" (UniqueName: \"kubernetes.io/projected/376df6ac-d55a-46a9-9b11-893215a316a7-kube-api-access-brlj2\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.752918 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376df6ac-d55a-46a9-9b11-893215a316a7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.759416 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376df6ac-d55a-46a9-9b11-893215a316a7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.760154 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/376df6ac-d55a-46a9-9b11-893215a316a7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.774769 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/376df6ac-d55a-46a9-9b11-893215a316a7-config-data\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.783944 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brlj2\" (UniqueName: 
\"kubernetes.io/projected/376df6ac-d55a-46a9-9b11-893215a316a7-kube-api-access-brlj2\") pod \"nova-metadata-0\" (UID: \"376df6ac-d55a-46a9-9b11-893215a316a7\") " pod="openstack/nova-metadata-0" Feb 16 22:01:20 crc kubenswrapper[4777]: I0216 22:01:20.910325 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 16 22:01:21 crc kubenswrapper[4777]: W0216 22:01:21.419154 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod376df6ac_d55a_46a9_9b11_893215a316a7.slice/crio-39a9fbc8a5c5cb18ea345ebe4a78a4ba961bf2c368962ed594d6b61e9558f5b4 WatchSource:0}: Error finding container 39a9fbc8a5c5cb18ea345ebe4a78a4ba961bf2c368962ed594d6b61e9558f5b4: Status 404 returned error can't find the container with id 39a9fbc8a5c5cb18ea345ebe4a78a4ba961bf2c368962ed594d6b61e9558f5b4 Feb 16 22:01:21 crc kubenswrapper[4777]: I0216 22:01:21.426391 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 16 22:01:22 crc kubenswrapper[4777]: I0216 22:01:22.193835 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ed3d1bb-e1a8-4a70-a364-8f688b13f68d" path="/var/lib/kubelet/pods/0ed3d1bb-e1a8-4a70-a364-8f688b13f68d/volumes" Feb 16 22:01:22 crc kubenswrapper[4777]: I0216 22:01:22.200392 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"376df6ac-d55a-46a9-9b11-893215a316a7","Type":"ContainerStarted","Data":"d29163b8d8ae1492b1e5e327b01265948e18c109c6726388f1abbf25239217ae"} Feb 16 22:01:22 crc kubenswrapper[4777]: I0216 22:01:22.200436 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"376df6ac-d55a-46a9-9b11-893215a316a7","Type":"ContainerStarted","Data":"b2e0da3a66adb3180f58974ad7f4a3b6cb0e84f789e350ebfd32a1d49c2b588c"} Feb 16 22:01:22 crc kubenswrapper[4777]: I0216 22:01:22.200449 4777 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"376df6ac-d55a-46a9-9b11-893215a316a7","Type":"ContainerStarted","Data":"39a9fbc8a5c5cb18ea345ebe4a78a4ba961bf2c368962ed594d6b61e9558f5b4"} Feb 16 22:01:22 crc kubenswrapper[4777]: I0216 22:01:22.234830 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.234808526 podStartE2EDuration="2.234808526s" podCreationTimestamp="2026-02-16 22:01:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-16 22:01:22.224174238 +0000 UTC m=+1402.806675370" watchObservedRunningTime="2026-02-16 22:01:22.234808526 +0000 UTC m=+1402.817309638" Feb 16 22:01:23 crc kubenswrapper[4777]: I0216 22:01:23.524521 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 16 22:01:25 crc kubenswrapper[4777]: I0216 22:01:25.910957 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 16 22:01:25 crc kubenswrapper[4777]: I0216 22:01:25.911460 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 16 22:01:26 crc kubenswrapper[4777]: E0216 22:01:26.325129 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:01:26 crc kubenswrapper[4777]: E0216 22:01:26.325215 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:01:26 crc kubenswrapper[4777]: E0216 22:01:26.325396 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/
var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError" Feb 16 22:01:26 crc kubenswrapper[4777]: E0216 22:01:26.327156 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:01:27 crc kubenswrapper[4777]: I0216 22:01:27.841295 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 16 22:01:27 crc kubenswrapper[4777]: I0216 22:01:27.841695 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 16 22:01:28 crc kubenswrapper[4777]: I0216 22:01:28.524026 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 16 22:01:28 crc kubenswrapper[4777]: I0216 22:01:28.570535 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 16 22:01:28 crc kubenswrapper[4777]: I0216 22:01:28.855938 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bbc0152a-9610-4701-b7f9-e2ae9ddcf53a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 16 22:01:28 crc kubenswrapper[4777]: I0216 22:01:28.855977 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bbc0152a-9610-4701-b7f9-e2ae9ddcf53a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 16 22:01:29 crc kubenswrapper[4777]: I0216 22:01:29.343414 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 16 22:01:30 crc kubenswrapper[4777]: I0216 22:01:30.910891 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 16 22:01:30 crc kubenswrapper[4777]: I0216 22:01:30.910930 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Feb 16 22:01:31 crc kubenswrapper[4777]: I0216 22:01:31.923875 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="376df6ac-d55a-46a9-9b11-893215a316a7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.226:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 16 22:01:31 crc kubenswrapper[4777]: I0216 22:01:31.923955 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="376df6ac-d55a-46a9-9b11-893215a316a7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.226:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 16 22:01:34 crc kubenswrapper[4777]: I0216 22:01:34.284249 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 16 22:01:37 crc kubenswrapper[4777]: E0216 22:01:37.184144 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:01:37 crc kubenswrapper[4777]: I0216 22:01:37.848201 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 16 22:01:37 crc kubenswrapper[4777]: I0216 22:01:37.848978 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 16 22:01:37 crc kubenswrapper[4777]: I0216 22:01:37.852503 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 16 22:01:37 crc kubenswrapper[4777]: I0216 22:01:37.859677 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Feb 16 22:01:38 crc kubenswrapper[4777]: I0216 22:01:38.428939 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 16 22:01:38 crc kubenswrapper[4777]: I0216 22:01:38.433686 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 16 22:01:40 crc kubenswrapper[4777]: I0216 22:01:40.921268 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 16 22:01:40 crc kubenswrapper[4777]: I0216 22:01:40.921765 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 16 22:01:40 crc kubenswrapper[4777]: I0216 22:01:40.930745 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 16 22:01:40 crc kubenswrapper[4777]: I0216 22:01:40.933748 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 16 22:01:41 crc kubenswrapper[4777]: I0216 22:01:41.963912 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gb4n6"] Feb 16 22:01:41 crc kubenswrapper[4777]: I0216 22:01:41.965779 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:41 crc kubenswrapper[4777]: I0216 22:01:41.983250 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gb4n6"] Feb 16 22:01:42 crc kubenswrapper[4777]: I0216 22:01:42.055067 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cd2q\" (UniqueName: \"kubernetes.io/projected/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-kube-api-access-6cd2q\") pod \"redhat-operators-gb4n6\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:42 crc kubenswrapper[4777]: I0216 22:01:42.055130 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-utilities\") pod \"redhat-operators-gb4n6\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:42 crc kubenswrapper[4777]: I0216 22:01:42.055328 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-catalog-content\") pod \"redhat-operators-gb4n6\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:42 crc kubenswrapper[4777]: I0216 22:01:42.157340 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cd2q\" (UniqueName: \"kubernetes.io/projected/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-kube-api-access-6cd2q\") pod \"redhat-operators-gb4n6\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:42 crc kubenswrapper[4777]: I0216 22:01:42.157421 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-utilities\") pod \"redhat-operators-gb4n6\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:42 crc kubenswrapper[4777]: I0216 22:01:42.157489 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-catalog-content\") pod \"redhat-operators-gb4n6\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:42 crc kubenswrapper[4777]: I0216 22:01:42.158145 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-catalog-content\") pod \"redhat-operators-gb4n6\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:42 crc kubenswrapper[4777]: I0216 22:01:42.158759 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-utilities\") pod \"redhat-operators-gb4n6\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:42 crc kubenswrapper[4777]: I0216 22:01:42.191592 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cd2q\" (UniqueName: \"kubernetes.io/projected/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-kube-api-access-6cd2q\") pod \"redhat-operators-gb4n6\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:42 crc kubenswrapper[4777]: I0216 22:01:42.313729 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:42 crc kubenswrapper[4777]: W0216 22:01:42.835564 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07d0e8a9_b40c_48d4_a1b0_0c2de37f2bc1.slice/crio-3461fa4dfce46a505fc9bc9f1f0c42f2fe232bbd20c40ad6448e5fd7d4b4c9d6 WatchSource:0}: Error finding container 3461fa4dfce46a505fc9bc9f1f0c42f2fe232bbd20c40ad6448e5fd7d4b4c9d6: Status 404 returned error can't find the container with id 3461fa4dfce46a505fc9bc9f1f0c42f2fe232bbd20c40ad6448e5fd7d4b4c9d6 Feb 16 22:01:42 crc kubenswrapper[4777]: I0216 22:01:42.835976 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gb4n6"] Feb 16 22:01:43 crc kubenswrapper[4777]: I0216 22:01:43.485168 4777 generic.go:334] "Generic (PLEG): container finished" podID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerID="6f2541952b99bf35623ac96b31979374afb015ccf9ebfcf2d89dbf0f6835bdeb" exitCode=0 Feb 16 22:01:43 crc kubenswrapper[4777]: I0216 22:01:43.485225 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb4n6" event={"ID":"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1","Type":"ContainerDied","Data":"6f2541952b99bf35623ac96b31979374afb015ccf9ebfcf2d89dbf0f6835bdeb"} Feb 16 22:01:43 crc kubenswrapper[4777]: I0216 22:01:43.485284 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb4n6" event={"ID":"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1","Type":"ContainerStarted","Data":"3461fa4dfce46a505fc9bc9f1f0c42f2fe232bbd20c40ad6448e5fd7d4b4c9d6"} Feb 16 22:01:44 crc kubenswrapper[4777]: I0216 22:01:44.499805 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb4n6" 
event={"ID":"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1","Type":"ContainerStarted","Data":"e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6"} Feb 16 22:01:46 crc kubenswrapper[4777]: I0216 22:01:46.530702 4777 generic.go:334] "Generic (PLEG): container finished" podID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerID="e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6" exitCode=0 Feb 16 22:01:46 crc kubenswrapper[4777]: I0216 22:01:46.530930 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb4n6" event={"ID":"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1","Type":"ContainerDied","Data":"e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6"} Feb 16 22:01:47 crc kubenswrapper[4777]: I0216 22:01:47.549011 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb4n6" event={"ID":"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1","Type":"ContainerStarted","Data":"dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7"} Feb 16 22:01:47 crc kubenswrapper[4777]: I0216 22:01:47.582148 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gb4n6" podStartSLOduration=3.085612394 podStartE2EDuration="6.582126081s" podCreationTimestamp="2026-02-16 22:01:41 +0000 UTC" firstStartedPulling="2026-02-16 22:01:43.48669966 +0000 UTC m=+1424.069200762" lastFinishedPulling="2026-02-16 22:01:46.983213317 +0000 UTC m=+1427.565714449" observedRunningTime="2026-02-16 22:01:47.578590442 +0000 UTC m=+1428.161091564" watchObservedRunningTime="2026-02-16 22:01:47.582126081 +0000 UTC m=+1428.164627193" Feb 16 22:01:49 crc kubenswrapper[4777]: E0216 22:01:49.184775 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" 
pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:01:52 crc kubenswrapper[4777]: I0216 22:01:52.314130 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:52 crc kubenswrapper[4777]: I0216 22:01:52.314468 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:01:53 crc kubenswrapper[4777]: I0216 22:01:53.396073 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gb4n6" podUID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerName="registry-server" probeResult="failure" output=< Feb 16 22:01:53 crc kubenswrapper[4777]: timeout: failed to connect service ":50051" within 1s Feb 16 22:01:53 crc kubenswrapper[4777]: > Feb 16 22:02:02 crc kubenswrapper[4777]: E0216 22:02:02.184968 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:02:02 crc kubenswrapper[4777]: I0216 22:02:02.389295 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:02:02 crc kubenswrapper[4777]: I0216 22:02:02.473269 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:02:02 crc kubenswrapper[4777]: I0216 22:02:02.639940 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gb4n6"] Feb 16 22:02:03 crc kubenswrapper[4777]: I0216 22:02:03.800868 4777 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-gb4n6" podUID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerName="registry-server" containerID="cri-o://dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7" gracePeriod=2 Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.391337 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.563006 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cd2q\" (UniqueName: \"kubernetes.io/projected/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-kube-api-access-6cd2q\") pod \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.563207 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-utilities\") pod \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.563352 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-catalog-content\") pod \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\" (UID: \"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1\") " Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.564015 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-utilities" (OuterVolumeSpecName: "utilities") pod "07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" (UID: "07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.564757 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.587841 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-kube-api-access-6cd2q" (OuterVolumeSpecName: "kube-api-access-6cd2q") pod "07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" (UID: "07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1"). InnerVolumeSpecName "kube-api-access-6cd2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.667186 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cd2q\" (UniqueName: \"kubernetes.io/projected/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-kube-api-access-6cd2q\") on node \"crc\" DevicePath \"\"" Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.725819 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" (UID: "07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.769802 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.816593 4777 generic.go:334] "Generic (PLEG): container finished" podID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerID="dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7" exitCode=0 Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.816636 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb4n6" event={"ID":"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1","Type":"ContainerDied","Data":"dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7"} Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.816663 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gb4n6" event={"ID":"07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1","Type":"ContainerDied","Data":"3461fa4dfce46a505fc9bc9f1f0c42f2fe232bbd20c40ad6448e5fd7d4b4c9d6"} Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.816681 4777 scope.go:117] "RemoveContainer" containerID="dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7" Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.816780 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gb4n6" Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.865975 4777 scope.go:117] "RemoveContainer" containerID="e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6" Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.868637 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gb4n6"] Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.878074 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gb4n6"] Feb 16 22:02:04 crc kubenswrapper[4777]: I0216 22:02:04.899941 4777 scope.go:117] "RemoveContainer" containerID="6f2541952b99bf35623ac96b31979374afb015ccf9ebfcf2d89dbf0f6835bdeb" Feb 16 22:02:05 crc kubenswrapper[4777]: I0216 22:02:05.002560 4777 scope.go:117] "RemoveContainer" containerID="dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7" Feb 16 22:02:05 crc kubenswrapper[4777]: E0216 22:02:05.003034 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7\": container with ID starting with dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7 not found: ID does not exist" containerID="dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7" Feb 16 22:02:05 crc kubenswrapper[4777]: I0216 22:02:05.003066 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7"} err="failed to get container status \"dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7\": rpc error: code = NotFound desc = could not find container \"dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7\": container with ID starting with dc4532dd56d66a06360a23bd693bfece217e4457005617878fd337e0b0dc07b7 not found: ID does 
not exist" Feb 16 22:02:05 crc kubenswrapper[4777]: I0216 22:02:05.003087 4777 scope.go:117] "RemoveContainer" containerID="e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6" Feb 16 22:02:05 crc kubenswrapper[4777]: E0216 22:02:05.003852 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6\": container with ID starting with e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6 not found: ID does not exist" containerID="e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6" Feb 16 22:02:05 crc kubenswrapper[4777]: I0216 22:02:05.004028 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6"} err="failed to get container status \"e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6\": rpc error: code = NotFound desc = could not find container \"e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6\": container with ID starting with e1c53f95ee3ba2f903b53c1aa15461546d2890502f6c78cd433a5e220cbab2b6 not found: ID does not exist" Feb 16 22:02:05 crc kubenswrapper[4777]: I0216 22:02:05.004078 4777 scope.go:117] "RemoveContainer" containerID="6f2541952b99bf35623ac96b31979374afb015ccf9ebfcf2d89dbf0f6835bdeb" Feb 16 22:02:05 crc kubenswrapper[4777]: E0216 22:02:05.004438 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2541952b99bf35623ac96b31979374afb015ccf9ebfcf2d89dbf0f6835bdeb\": container with ID starting with 6f2541952b99bf35623ac96b31979374afb015ccf9ebfcf2d89dbf0f6835bdeb not found: ID does not exist" containerID="6f2541952b99bf35623ac96b31979374afb015ccf9ebfcf2d89dbf0f6835bdeb" Feb 16 22:02:05 crc kubenswrapper[4777]: I0216 22:02:05.004462 4777 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2541952b99bf35623ac96b31979374afb015ccf9ebfcf2d89dbf0f6835bdeb"} err="failed to get container status \"6f2541952b99bf35623ac96b31979374afb015ccf9ebfcf2d89dbf0f6835bdeb\": rpc error: code = NotFound desc = could not find container \"6f2541952b99bf35623ac96b31979374afb015ccf9ebfcf2d89dbf0f6835bdeb\": container with ID starting with 6f2541952b99bf35623ac96b31979374afb015ccf9ebfcf2d89dbf0f6835bdeb not found: ID does not exist" Feb 16 22:02:06 crc kubenswrapper[4777]: I0216 22:02:06.203039 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" path="/var/lib/kubelet/pods/07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1/volumes" Feb 16 22:02:15 crc kubenswrapper[4777]: E0216 22:02:15.184798 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:02:29 crc kubenswrapper[4777]: E0216 22:02:29.185123 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:02:42 crc kubenswrapper[4777]: E0216 22:02:42.186350 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:02:56 crc kubenswrapper[4777]: 
E0216 22:02:56.184745 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:03:07 crc kubenswrapper[4777]: E0216 22:03:07.184909 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:03:11 crc kubenswrapper[4777]: I0216 22:03:11.652211 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 22:03:11 crc kubenswrapper[4777]: I0216 22:03:11.653072 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 22:03:18 crc kubenswrapper[4777]: E0216 22:03:18.184079 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:03:30 crc kubenswrapper[4777]: E0216 22:03:30.203565 4777 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:03:41 crc kubenswrapper[4777]: I0216 22:03:41.651767 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 22:03:41 crc kubenswrapper[4777]: I0216 22:03:41.652251 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 22:03:42 crc kubenswrapper[4777]: E0216 22:03:42.184126 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:03:57 crc kubenswrapper[4777]: E0216 22:03:57.184998 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:04:01 crc kubenswrapper[4777]: I0216 22:04:01.970768 4777 scope.go:117] "RemoveContainer" 
containerID="6b6202e2fcd41bc1f83a6eb4180a9b5138121d6dfe1affff17682819aeeee2ff" Feb 16 22:04:02 crc kubenswrapper[4777]: I0216 22:04:02.021603 4777 scope.go:117] "RemoveContainer" containerID="4630e87ac1c542d44341d25d6197690ce1103bedcb540f3ca3b1cf8df95d5ed8" Feb 16 22:04:11 crc kubenswrapper[4777]: I0216 22:04:11.651656 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 22:04:11 crc kubenswrapper[4777]: I0216 22:04:11.652001 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 22:04:11 crc kubenswrapper[4777]: I0216 22:04:11.652045 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 22:04:11 crc kubenswrapper[4777]: I0216 22:04:11.652766 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 16 22:04:11 crc kubenswrapper[4777]: I0216 22:04:11.652811 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" 
containerID="cri-o://a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" gracePeriod=600 Feb 16 22:04:11 crc kubenswrapper[4777]: E0216 22:04:11.781035 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:04:12 crc kubenswrapper[4777]: I0216 22:04:12.448156 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" exitCode=0 Feb 16 22:04:12 crc kubenswrapper[4777]: I0216 22:04:12.448234 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2"} Feb 16 22:04:12 crc kubenswrapper[4777]: I0216 22:04:12.448498 4777 scope.go:117] "RemoveContainer" containerID="79b7f65a1dac7c39999487fce7121e7b9f97dac2a41e3e237c9dfb015a4112fd" Feb 16 22:04:12 crc kubenswrapper[4777]: I0216 22:04:12.449262 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:04:12 crc kubenswrapper[4777]: E0216 22:04:12.449684 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" 
podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:04:12 crc kubenswrapper[4777]: E0216 22:04:12.468836 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:04:12 crc kubenswrapper[4777]: E0216 22:04:12.468903 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:04:12 crc kubenswrapper[4777]: E0216 22:04:12.469049 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError" Feb 16 22:04:12 crc kubenswrapper[4777]: E0216 22:04:12.470256 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:04:24 crc kubenswrapper[4777]: E0216 22:04:24.186548 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:04:25 crc kubenswrapper[4777]: I0216 22:04:25.182262 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:04:25 crc kubenswrapper[4777]: E0216 22:04:25.183078 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:04:36 crc kubenswrapper[4777]: I0216 22:04:36.182519 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:04:36 crc kubenswrapper[4777]: E0216 22:04:36.183573 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:04:36 crc kubenswrapper[4777]: E0216 22:04:36.185399 4777 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:04:49 crc kubenswrapper[4777]: E0216 22:04:49.186788 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:04:50 crc kubenswrapper[4777]: I0216 22:04:50.189442 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:04:50 crc kubenswrapper[4777]: E0216 22:04:50.190013 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:05:02 crc kubenswrapper[4777]: I0216 22:05:02.162056 4777 scope.go:117] "RemoveContainer" containerID="65228178cd903289646d3f1f31e24fc69e55721848afd119341bf3705d539e19" Feb 16 22:05:02 crc kubenswrapper[4777]: I0216 22:05:02.204820 4777 scope.go:117] "RemoveContainer" containerID="13f202891eddbe42db5953a9a4abdfb38766d69f3225c1265e219f95c90a09e4" Feb 16 22:05:02 crc kubenswrapper[4777]: E0216 22:05:02.205016 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:05:02 crc kubenswrapper[4777]: I0216 22:05:02.236782 4777 scope.go:117] "RemoveContainer" containerID="510f6b0d603b1f6f11444e526b6aaf6eb9e02c8b5586a53aaad0c39ba9014898" Feb 16 22:05:02 crc kubenswrapper[4777]: I0216 22:05:02.301680 4777 scope.go:117] "RemoveContainer" containerID="8235b8d1d9b22da04310169790fa837ce57f1ffee37a8026b1f976e0f1a0cb0c" Feb 16 22:05:03 crc kubenswrapper[4777]: I0216 22:05:03.182098 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:05:03 crc kubenswrapper[4777]: E0216 22:05:03.183034 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:05:14 crc kubenswrapper[4777]: E0216 22:05:14.186507 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:05:16 crc kubenswrapper[4777]: I0216 22:05:16.182557 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:05:16 crc kubenswrapper[4777]: E0216 22:05:16.183453 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.357698 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h95mh"] Feb 16 22:05:18 crc kubenswrapper[4777]: E0216 22:05:18.358545 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerName="extract-content" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.358563 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerName="extract-content" Feb 16 22:05:18 crc kubenswrapper[4777]: E0216 22:05:18.358576 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerName="registry-server" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.358585 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerName="registry-server" Feb 16 22:05:18 crc kubenswrapper[4777]: E0216 22:05:18.358605 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerName="extract-utilities" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.358613 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerName="extract-utilities" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.358948 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d0e8a9-b40c-48d4-a1b0-0c2de37f2bc1" containerName="registry-server" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.360878 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.371418 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h95mh"] Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.520398 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfvfc\" (UniqueName: \"kubernetes.io/projected/198c33e0-315f-4bce-8ebf-58174312717a-kube-api-access-qfvfc\") pod \"certified-operators-h95mh\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.521011 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-utilities\") pod \"certified-operators-h95mh\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.521074 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-catalog-content\") pod \"certified-operators-h95mh\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.622575 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-utilities\") pod \"certified-operators-h95mh\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.622619 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-catalog-content\") pod \"certified-operators-h95mh\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.622664 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfvfc\" (UniqueName: \"kubernetes.io/projected/198c33e0-315f-4bce-8ebf-58174312717a-kube-api-access-qfvfc\") pod \"certified-operators-h95mh\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.623187 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-catalog-content\") pod \"certified-operators-h95mh\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.623532 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-utilities\") pod \"certified-operators-h95mh\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.642361 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfvfc\" (UniqueName: \"kubernetes.io/projected/198c33e0-315f-4bce-8ebf-58174312717a-kube-api-access-qfvfc\") pod \"certified-operators-h95mh\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:18 crc kubenswrapper[4777]: I0216 22:05:18.707032 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:19 crc kubenswrapper[4777]: I0216 22:05:19.190303 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h95mh"] Feb 16 22:05:19 crc kubenswrapper[4777]: I0216 22:05:19.368629 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h95mh" event={"ID":"198c33e0-315f-4bce-8ebf-58174312717a","Type":"ContainerStarted","Data":"6a95ed8c98274295f0590d7e5785e9097a02b7848ce157b69e2f4e651caae91a"} Feb 16 22:05:20 crc kubenswrapper[4777]: I0216 22:05:20.387575 4777 generic.go:334] "Generic (PLEG): container finished" podID="198c33e0-315f-4bce-8ebf-58174312717a" containerID="68d53e77a6b9203e8ec1aee7b38786d23d17a95cfd42b2ea0b14c0b4bfc8ef8b" exitCode=0 Feb 16 22:05:20 crc kubenswrapper[4777]: I0216 22:05:20.387650 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h95mh" event={"ID":"198c33e0-315f-4bce-8ebf-58174312717a","Type":"ContainerDied","Data":"68d53e77a6b9203e8ec1aee7b38786d23d17a95cfd42b2ea0b14c0b4bfc8ef8b"} Feb 16 22:05:20 crc kubenswrapper[4777]: I0216 22:05:20.391041 4777 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 16 22:05:21 crc kubenswrapper[4777]: I0216 22:05:21.401910 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h95mh" event={"ID":"198c33e0-315f-4bce-8ebf-58174312717a","Type":"ContainerStarted","Data":"82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf"} Feb 16 22:05:21 crc kubenswrapper[4777]: E0216 22:05:21.589017 4777 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod198c33e0_315f_4bce_8ebf_58174312717a.slice/crio-82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf.scope\": RecentStats: unable to find data in memory cache]" Feb 16 22:05:22 crc kubenswrapper[4777]: I0216 22:05:22.418338 4777 generic.go:334] "Generic (PLEG): container finished" podID="198c33e0-315f-4bce-8ebf-58174312717a" containerID="82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf" exitCode=0 Feb 16 22:05:22 crc kubenswrapper[4777]: I0216 22:05:22.418447 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h95mh" event={"ID":"198c33e0-315f-4bce-8ebf-58174312717a","Type":"ContainerDied","Data":"82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf"} Feb 16 22:05:23 crc kubenswrapper[4777]: I0216 22:05:23.532456 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h95mh" event={"ID":"198c33e0-315f-4bce-8ebf-58174312717a","Type":"ContainerStarted","Data":"ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3"} Feb 16 22:05:23 crc kubenswrapper[4777]: I0216 22:05:23.561321 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h95mh" podStartSLOduration=3.103474705 podStartE2EDuration="5.561303951s" podCreationTimestamp="2026-02-16 22:05:18 +0000 UTC" firstStartedPulling="2026-02-16 22:05:20.39063808 +0000 UTC m=+1640.973139192" lastFinishedPulling="2026-02-16 22:05:22.848467336 +0000 UTC m=+1643.430968438" observedRunningTime="2026-02-16 22:05:23.555261991 +0000 UTC m=+1644.137763103" watchObservedRunningTime="2026-02-16 22:05:23.561303951 +0000 UTC m=+1644.143805063" Feb 16 22:05:27 crc kubenswrapper[4777]: E0216 22:05:27.185005 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:05:28 crc kubenswrapper[4777]: I0216 22:05:28.707650 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:28 crc kubenswrapper[4777]: I0216 22:05:28.708103 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:28 crc kubenswrapper[4777]: I0216 22:05:28.766794 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:29 crc kubenswrapper[4777]: I0216 22:05:29.694849 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:29 crc kubenswrapper[4777]: I0216 22:05:29.766036 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h95mh"] Feb 16 22:05:30 crc kubenswrapper[4777]: I0216 22:05:30.200117 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:05:30 crc kubenswrapper[4777]: E0216 22:05:30.200950 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:05:31 crc kubenswrapper[4777]: I0216 22:05:31.650515 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h95mh" 
podUID="198c33e0-315f-4bce-8ebf-58174312717a" containerName="registry-server" containerID="cri-o://ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3" gracePeriod=2 Feb 16 22:05:31 crc kubenswrapper[4777]: E0216 22:05:31.871354 4777 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod198c33e0_315f_4bce_8ebf_58174312717a.slice/crio-ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3.scope\": RecentStats: unable to find data in memory cache]" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.237625 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.354130 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-utilities\") pod \"198c33e0-315f-4bce-8ebf-58174312717a\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.354747 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-catalog-content\") pod \"198c33e0-315f-4bce-8ebf-58174312717a\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.354835 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfvfc\" (UniqueName: \"kubernetes.io/projected/198c33e0-315f-4bce-8ebf-58174312717a-kube-api-access-qfvfc\") pod \"198c33e0-315f-4bce-8ebf-58174312717a\" (UID: \"198c33e0-315f-4bce-8ebf-58174312717a\") " Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.355088 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-utilities" (OuterVolumeSpecName: "utilities") pod "198c33e0-315f-4bce-8ebf-58174312717a" (UID: "198c33e0-315f-4bce-8ebf-58174312717a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.355539 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.362582 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/198c33e0-315f-4bce-8ebf-58174312717a-kube-api-access-qfvfc" (OuterVolumeSpecName: "kube-api-access-qfvfc") pod "198c33e0-315f-4bce-8ebf-58174312717a" (UID: "198c33e0-315f-4bce-8ebf-58174312717a"). InnerVolumeSpecName "kube-api-access-qfvfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.414757 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "198c33e0-315f-4bce-8ebf-58174312717a" (UID: "198c33e0-315f-4bce-8ebf-58174312717a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.456827 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/198c33e0-315f-4bce-8ebf-58174312717a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.456873 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfvfc\" (UniqueName: \"kubernetes.io/projected/198c33e0-315f-4bce-8ebf-58174312717a-kube-api-access-qfvfc\") on node \"crc\" DevicePath \"\"" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.664420 4777 generic.go:334] "Generic (PLEG): container finished" podID="198c33e0-315f-4bce-8ebf-58174312717a" containerID="ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3" exitCode=0 Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.664514 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h95mh" event={"ID":"198c33e0-315f-4bce-8ebf-58174312717a","Type":"ContainerDied","Data":"ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3"} Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.664565 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h95mh" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.664608 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h95mh" event={"ID":"198c33e0-315f-4bce-8ebf-58174312717a","Type":"ContainerDied","Data":"6a95ed8c98274295f0590d7e5785e9097a02b7848ce157b69e2f4e651caae91a"} Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.664682 4777 scope.go:117] "RemoveContainer" containerID="ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.698693 4777 scope.go:117] "RemoveContainer" containerID="82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.732642 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h95mh"] Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.735977 4777 scope.go:117] "RemoveContainer" containerID="68d53e77a6b9203e8ec1aee7b38786d23d17a95cfd42b2ea0b14c0b4bfc8ef8b" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.745685 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h95mh"] Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.815596 4777 scope.go:117] "RemoveContainer" containerID="ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3" Feb 16 22:05:32 crc kubenswrapper[4777]: E0216 22:05:32.816842 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3\": container with ID starting with ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3 not found: ID does not exist" containerID="ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.816906 4777 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3"} err="failed to get container status \"ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3\": rpc error: code = NotFound desc = could not find container \"ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3\": container with ID starting with ebdc34d661a97f955ad00943f607b2b80c9f09c330b2b84deee80ef1389f3aa3 not found: ID does not exist" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.816949 4777 scope.go:117] "RemoveContainer" containerID="82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf" Feb 16 22:05:32 crc kubenswrapper[4777]: E0216 22:05:32.817324 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf\": container with ID starting with 82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf not found: ID does not exist" containerID="82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.817351 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf"} err="failed to get container status \"82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf\": rpc error: code = NotFound desc = could not find container \"82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf\": container with ID starting with 82cec67b444dcd4b5327e3c7274b1f3d7ddbb11c56187e3aef83cf28b75d03cf not found: ID does not exist" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.817398 4777 scope.go:117] "RemoveContainer" containerID="68d53e77a6b9203e8ec1aee7b38786d23d17a95cfd42b2ea0b14c0b4bfc8ef8b" Feb 16 22:05:32 crc kubenswrapper[4777]: E0216 
22:05:32.817816 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d53e77a6b9203e8ec1aee7b38786d23d17a95cfd42b2ea0b14c0b4bfc8ef8b\": container with ID starting with 68d53e77a6b9203e8ec1aee7b38786d23d17a95cfd42b2ea0b14c0b4bfc8ef8b not found: ID does not exist" containerID="68d53e77a6b9203e8ec1aee7b38786d23d17a95cfd42b2ea0b14c0b4bfc8ef8b" Feb 16 22:05:32 crc kubenswrapper[4777]: I0216 22:05:32.817855 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d53e77a6b9203e8ec1aee7b38786d23d17a95cfd42b2ea0b14c0b4bfc8ef8b"} err="failed to get container status \"68d53e77a6b9203e8ec1aee7b38786d23d17a95cfd42b2ea0b14c0b4bfc8ef8b\": rpc error: code = NotFound desc = could not find container \"68d53e77a6b9203e8ec1aee7b38786d23d17a95cfd42b2ea0b14c0b4bfc8ef8b\": container with ID starting with 68d53e77a6b9203e8ec1aee7b38786d23d17a95cfd42b2ea0b14c0b4bfc8ef8b not found: ID does not exist" Feb 16 22:05:34 crc kubenswrapper[4777]: I0216 22:05:34.225164 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="198c33e0-315f-4bce-8ebf-58174312717a" path="/var/lib/kubelet/pods/198c33e0-315f-4bce-8ebf-58174312717a/volumes" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.189159 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g597d"] Feb 16 22:05:35 crc kubenswrapper[4777]: E0216 22:05:35.189832 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198c33e0-315f-4bce-8ebf-58174312717a" containerName="extract-utilities" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.189863 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="198c33e0-315f-4bce-8ebf-58174312717a" containerName="extract-utilities" Feb 16 22:05:35 crc kubenswrapper[4777]: E0216 22:05:35.189885 4777 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="198c33e0-315f-4bce-8ebf-58174312717a" containerName="registry-server" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.189899 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="198c33e0-315f-4bce-8ebf-58174312717a" containerName="registry-server" Feb 16 22:05:35 crc kubenswrapper[4777]: E0216 22:05:35.189920 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198c33e0-315f-4bce-8ebf-58174312717a" containerName="extract-content" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.189934 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="198c33e0-315f-4bce-8ebf-58174312717a" containerName="extract-content" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.190343 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="198c33e0-315f-4bce-8ebf-58174312717a" containerName="registry-server" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.193170 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.215484 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g597d"] Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.345885 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw57g\" (UniqueName: \"kubernetes.io/projected/95bef989-7403-4b64-8b34-5105cd921de1-kube-api-access-bw57g\") pod \"redhat-marketplace-g597d\" (UID: \"95bef989-7403-4b64-8b34-5105cd921de1\") " pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.345999 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-catalog-content\") pod \"redhat-marketplace-g597d\" (UID: 
\"95bef989-7403-4b64-8b34-5105cd921de1\") " pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.346237 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-utilities\") pod \"redhat-marketplace-g597d\" (UID: \"95bef989-7403-4b64-8b34-5105cd921de1\") " pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.448588 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-utilities\") pod \"redhat-marketplace-g597d\" (UID: \"95bef989-7403-4b64-8b34-5105cd921de1\") " pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.448819 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw57g\" (UniqueName: \"kubernetes.io/projected/95bef989-7403-4b64-8b34-5105cd921de1-kube-api-access-bw57g\") pod \"redhat-marketplace-g597d\" (UID: \"95bef989-7403-4b64-8b34-5105cd921de1\") " pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.448910 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-catalog-content\") pod \"redhat-marketplace-g597d\" (UID: \"95bef989-7403-4b64-8b34-5105cd921de1\") " pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.449887 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-catalog-content\") pod \"redhat-marketplace-g597d\" (UID: 
\"95bef989-7403-4b64-8b34-5105cd921de1\") " pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.450582 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-utilities\") pod \"redhat-marketplace-g597d\" (UID: \"95bef989-7403-4b64-8b34-5105cd921de1\") " pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.481021 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw57g\" (UniqueName: \"kubernetes.io/projected/95bef989-7403-4b64-8b34-5105cd921de1-kube-api-access-bw57g\") pod \"redhat-marketplace-g597d\" (UID: \"95bef989-7403-4b64-8b34-5105cd921de1\") " pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:35 crc kubenswrapper[4777]: I0216 22:05:35.522931 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:36 crc kubenswrapper[4777]: I0216 22:05:36.078452 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g597d"] Feb 16 22:05:36 crc kubenswrapper[4777]: W0216 22:05:36.094931 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95bef989_7403_4b64_8b34_5105cd921de1.slice/crio-008fbabc0cc9c75d8566f8088d1579c43d42f231ff15cdcf838912467049cde0 WatchSource:0}: Error finding container 008fbabc0cc9c75d8566f8088d1579c43d42f231ff15cdcf838912467049cde0: Status 404 returned error can't find the container with id 008fbabc0cc9c75d8566f8088d1579c43d42f231ff15cdcf838912467049cde0 Feb 16 22:05:36 crc kubenswrapper[4777]: I0216 22:05:36.719270 4777 generic.go:334] "Generic (PLEG): container finished" podID="95bef989-7403-4b64-8b34-5105cd921de1" 
containerID="777efbcee12ef482508f9443c8cef1bfb0ea0ad68d5291f7bd37f8a369feb672" exitCode=0 Feb 16 22:05:36 crc kubenswrapper[4777]: I0216 22:05:36.719357 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g597d" event={"ID":"95bef989-7403-4b64-8b34-5105cd921de1","Type":"ContainerDied","Data":"777efbcee12ef482508f9443c8cef1bfb0ea0ad68d5291f7bd37f8a369feb672"} Feb 16 22:05:36 crc kubenswrapper[4777]: I0216 22:05:36.719420 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g597d" event={"ID":"95bef989-7403-4b64-8b34-5105cd921de1","Type":"ContainerStarted","Data":"008fbabc0cc9c75d8566f8088d1579c43d42f231ff15cdcf838912467049cde0"} Feb 16 22:05:37 crc kubenswrapper[4777]: I0216 22:05:37.736901 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g597d" event={"ID":"95bef989-7403-4b64-8b34-5105cd921de1","Type":"ContainerStarted","Data":"c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a"} Feb 16 22:05:38 crc kubenswrapper[4777]: I0216 22:05:38.750430 4777 generic.go:334] "Generic (PLEG): container finished" podID="95bef989-7403-4b64-8b34-5105cd921de1" containerID="c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a" exitCode=0 Feb 16 22:05:38 crc kubenswrapper[4777]: I0216 22:05:38.750526 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g597d" event={"ID":"95bef989-7403-4b64-8b34-5105cd921de1","Type":"ContainerDied","Data":"c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a"} Feb 16 22:05:38 crc kubenswrapper[4777]: I0216 22:05:38.750918 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g597d" event={"ID":"95bef989-7403-4b64-8b34-5105cd921de1","Type":"ContainerStarted","Data":"1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375"} Feb 16 22:05:38 crc kubenswrapper[4777]: 
I0216 22:05:38.772021 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g597d" podStartSLOduration=2.362046469 podStartE2EDuration="3.772004778s" podCreationTimestamp="2026-02-16 22:05:35 +0000 UTC" firstStartedPulling="2026-02-16 22:05:36.723917527 +0000 UTC m=+1657.306418669" lastFinishedPulling="2026-02-16 22:05:38.133875836 +0000 UTC m=+1658.716376978" observedRunningTime="2026-02-16 22:05:38.768779517 +0000 UTC m=+1659.351280629" watchObservedRunningTime="2026-02-16 22:05:38.772004778 +0000 UTC m=+1659.354505890" Feb 16 22:05:42 crc kubenswrapper[4777]: E0216 22:05:42.198635 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:05:43 crc kubenswrapper[4777]: I0216 22:05:43.183111 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:05:43 crc kubenswrapper[4777]: E0216 22:05:43.184054 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:05:45 crc kubenswrapper[4777]: I0216 22:05:45.524202 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:45 crc kubenswrapper[4777]: I0216 22:05:45.524738 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:45 crc kubenswrapper[4777]: I0216 22:05:45.619756 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:45 crc kubenswrapper[4777]: I0216 22:05:45.902027 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:45 crc kubenswrapper[4777]: I0216 22:05:45.971650 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g597d"] Feb 16 22:05:47 crc kubenswrapper[4777]: I0216 22:05:47.867319 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g597d" podUID="95bef989-7403-4b64-8b34-5105cd921de1" containerName="registry-server" containerID="cri-o://1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375" gracePeriod=2 Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.410458 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.504314 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw57g\" (UniqueName: \"kubernetes.io/projected/95bef989-7403-4b64-8b34-5105cd921de1-kube-api-access-bw57g\") pod \"95bef989-7403-4b64-8b34-5105cd921de1\" (UID: \"95bef989-7403-4b64-8b34-5105cd921de1\") " Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.504388 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-catalog-content\") pod \"95bef989-7403-4b64-8b34-5105cd921de1\" (UID: \"95bef989-7403-4b64-8b34-5105cd921de1\") " Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.504593 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-utilities\") pod \"95bef989-7403-4b64-8b34-5105cd921de1\" (UID: \"95bef989-7403-4b64-8b34-5105cd921de1\") " Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.506393 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-utilities" (OuterVolumeSpecName: "utilities") pod "95bef989-7403-4b64-8b34-5105cd921de1" (UID: "95bef989-7403-4b64-8b34-5105cd921de1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.524998 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95bef989-7403-4b64-8b34-5105cd921de1-kube-api-access-bw57g" (OuterVolumeSpecName: "kube-api-access-bw57g") pod "95bef989-7403-4b64-8b34-5105cd921de1" (UID: "95bef989-7403-4b64-8b34-5105cd921de1"). InnerVolumeSpecName "kube-api-access-bw57g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.544958 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95bef989-7403-4b64-8b34-5105cd921de1" (UID: "95bef989-7403-4b64-8b34-5105cd921de1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.607677 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw57g\" (UniqueName: \"kubernetes.io/projected/95bef989-7403-4b64-8b34-5105cd921de1-kube-api-access-bw57g\") on node \"crc\" DevicePath \"\"" Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.607738 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.607755 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95bef989-7403-4b64-8b34-5105cd921de1-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.886003 4777 generic.go:334] "Generic (PLEG): container finished" podID="95bef989-7403-4b64-8b34-5105cd921de1" containerID="1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375" exitCode=0 Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.886091 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g597d" event={"ID":"95bef989-7403-4b64-8b34-5105cd921de1","Type":"ContainerDied","Data":"1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375"} Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.886420 4777 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-g597d" event={"ID":"95bef989-7403-4b64-8b34-5105cd921de1","Type":"ContainerDied","Data":"008fbabc0cc9c75d8566f8088d1579c43d42f231ff15cdcf838912467049cde0"} Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.886194 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g597d" Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.886439 4777 scope.go:117] "RemoveContainer" containerID="1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375" Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.922078 4777 scope.go:117] "RemoveContainer" containerID="c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a" Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.941104 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g597d"] Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.951784 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g597d"] Feb 16 22:05:48 crc kubenswrapper[4777]: I0216 22:05:48.964564 4777 scope.go:117] "RemoveContainer" containerID="777efbcee12ef482508f9443c8cef1bfb0ea0ad68d5291f7bd37f8a369feb672" Feb 16 22:05:49 crc kubenswrapper[4777]: I0216 22:05:49.011424 4777 scope.go:117] "RemoveContainer" containerID="1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375" Feb 16 22:05:49 crc kubenswrapper[4777]: E0216 22:05:49.012011 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375\": container with ID starting with 1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375 not found: ID does not exist" containerID="1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375" Feb 16 22:05:49 crc kubenswrapper[4777]: I0216 22:05:49.012058 4777 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375"} err="failed to get container status \"1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375\": rpc error: code = NotFound desc = could not find container \"1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375\": container with ID starting with 1876e3baaa326a0d8f1c25bcf6d12e057b5a8f5d1e8c9068fd91fd4105d1a375 not found: ID does not exist" Feb 16 22:05:49 crc kubenswrapper[4777]: I0216 22:05:49.012086 4777 scope.go:117] "RemoveContainer" containerID="c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a" Feb 16 22:05:49 crc kubenswrapper[4777]: E0216 22:05:49.012449 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a\": container with ID starting with c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a not found: ID does not exist" containerID="c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a" Feb 16 22:05:49 crc kubenswrapper[4777]: I0216 22:05:49.012497 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a"} err="failed to get container status \"c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a\": rpc error: code = NotFound desc = could not find container \"c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a\": container with ID starting with c56373bc3497242ea82f1a5f143029b628d8458170c56776c91e4d28b5298a1a not found: ID does not exist" Feb 16 22:05:49 crc kubenswrapper[4777]: I0216 22:05:49.012515 4777 scope.go:117] "RemoveContainer" containerID="777efbcee12ef482508f9443c8cef1bfb0ea0ad68d5291f7bd37f8a369feb672" Feb 16 22:05:49 crc kubenswrapper[4777]: E0216 
22:05:49.012901 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777efbcee12ef482508f9443c8cef1bfb0ea0ad68d5291f7bd37f8a369feb672\": container with ID starting with 777efbcee12ef482508f9443c8cef1bfb0ea0ad68d5291f7bd37f8a369feb672 not found: ID does not exist" containerID="777efbcee12ef482508f9443c8cef1bfb0ea0ad68d5291f7bd37f8a369feb672" Feb 16 22:05:49 crc kubenswrapper[4777]: I0216 22:05:49.012939 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777efbcee12ef482508f9443c8cef1bfb0ea0ad68d5291f7bd37f8a369feb672"} err="failed to get container status \"777efbcee12ef482508f9443c8cef1bfb0ea0ad68d5291f7bd37f8a369feb672\": rpc error: code = NotFound desc = could not find container \"777efbcee12ef482508f9443c8cef1bfb0ea0ad68d5291f7bd37f8a369feb672\": container with ID starting with 777efbcee12ef482508f9443c8cef1bfb0ea0ad68d5291f7bd37f8a369feb672 not found: ID does not exist" Feb 16 22:05:50 crc kubenswrapper[4777]: I0216 22:05:50.203929 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95bef989-7403-4b64-8b34-5105cd921de1" path="/var/lib/kubelet/pods/95bef989-7403-4b64-8b34-5105cd921de1/volumes" Feb 16 22:05:54 crc kubenswrapper[4777]: E0216 22:05:54.184350 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:05:55 crc kubenswrapper[4777]: I0216 22:05:55.181835 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:05:55 crc kubenswrapper[4777]: E0216 22:05:55.183174 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:06:08 crc kubenswrapper[4777]: E0216 22:06:08.184827 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:06:09 crc kubenswrapper[4777]: I0216 22:06:09.183314 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:06:09 crc kubenswrapper[4777]: E0216 22:06:09.184317 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:06:20 crc kubenswrapper[4777]: I0216 22:06:20.186558 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:06:20 crc kubenswrapper[4777]: E0216 22:06:20.187323 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:06:21 crc kubenswrapper[4777]: E0216 22:06:21.184484 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:06:34 crc kubenswrapper[4777]: E0216 22:06:34.184121 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:06:35 crc kubenswrapper[4777]: I0216 22:06:35.183440 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:06:35 crc kubenswrapper[4777]: E0216 22:06:35.184188 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:06:46 crc kubenswrapper[4777]: I0216 22:06:46.183069 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:06:46 crc kubenswrapper[4777]: E0216 22:06:46.184516 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:06:46 crc kubenswrapper[4777]: E0216 22:06:46.184749 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:06:57 crc kubenswrapper[4777]: E0216 22:06:57.185057 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:06:59 crc kubenswrapper[4777]: I0216 22:06:59.182342 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:06:59 crc kubenswrapper[4777]: E0216 22:06:59.182942 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:07:02 crc kubenswrapper[4777]: I0216 22:07:02.538743 4777 scope.go:117] "RemoveContainer" containerID="5c714ae03515451f58c19f4f755969d63cee1ae5ea290e98e98e2abeaaba0119" Feb 16 22:07:02 crc kubenswrapper[4777]: I0216 22:07:02.579374 4777 
scope.go:117] "RemoveContainer" containerID="7f86f61bf328e1461a3a3cb5ab33db2178c5b3b20efc8fd3bf80d8b356ba838a" Feb 16 22:07:02 crc kubenswrapper[4777]: I0216 22:07:02.606165 4777 scope.go:117] "RemoveContainer" containerID="5bf312845e6a092d578609bd2019e50ffaff95942213bc94877e84e270983af5" Feb 16 22:07:10 crc kubenswrapper[4777]: E0216 22:07:10.193243 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:07:12 crc kubenswrapper[4777]: I0216 22:07:12.184507 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:07:12 crc kubenswrapper[4777]: E0216 22:07:12.185245 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:07:21 crc kubenswrapper[4777]: E0216 22:07:21.184040 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:07:27 crc kubenswrapper[4777]: I0216 22:07:27.182334 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:07:27 crc kubenswrapper[4777]: E0216 
22:07:27.183436 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:07:35 crc kubenswrapper[4777]: E0216 22:07:35.183674 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.062951 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2f2h7"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.075326 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2f2h7"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.088197 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d3e5-account-create-update-gb48v"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.098407 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d3e5-account-create-update-gb48v"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.106333 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5ed2-account-create-update-bfgz5"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.114314 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-n67bs"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.122596 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-create-gk52z"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.132208 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5ed2-account-create-update-bfgz5"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.142102 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-gk52z"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.152122 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-n67bs"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.165698 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3613-account-create-update-g4zwl"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.179627 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3613-account-create-update-g4zwl"] Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.198971 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f33b75b-727b-44c0-a663-672eb02c8862" path="/var/lib/kubelet/pods/1f33b75b-727b-44c0-a663-672eb02c8862/volumes" Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.199922 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e168960-b8cc-45a1-98e5-5b2157b299a2" path="/var/lib/kubelet/pods/2e168960-b8cc-45a1-98e5-5b2157b299a2/volumes" Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.200624 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473b7fa7-2de5-4e0a-921e-80880017c429" path="/var/lib/kubelet/pods/473b7fa7-2de5-4e0a-921e-80880017c429/volumes" Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.201375 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="525a2ff3-f0fb-45fb-8231-e18cea438b9c" path="/var/lib/kubelet/pods/525a2ff3-f0fb-45fb-8231-e18cea438b9c/volumes" Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.202760 4777 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="e9eb4b5f-7a71-4f57-837b-73f4544c0a4a" path="/var/lib/kubelet/pods/e9eb4b5f-7a71-4f57-837b-73f4544c0a4a/volumes" Feb 16 22:07:38 crc kubenswrapper[4777]: I0216 22:07:38.203469 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53cb73a-1d77-4527-b9c5-34f2091972a3" path="/var/lib/kubelet/pods/f53cb73a-1d77-4527-b9c5-34f2091972a3/volumes" Feb 16 22:07:42 crc kubenswrapper[4777]: I0216 22:07:42.183317 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:07:42 crc kubenswrapper[4777]: E0216 22:07:42.184360 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:07:51 crc kubenswrapper[4777]: E0216 22:07:51.185460 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:07:53 crc kubenswrapper[4777]: I0216 22:07:53.182545 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:07:53 crc kubenswrapper[4777]: E0216 22:07:53.183586 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:08:00 crc kubenswrapper[4777]: I0216 22:08:00.045465 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2xkxn"] Feb 16 22:08:00 crc kubenswrapper[4777]: I0216 22:08:00.061569 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2xkxn"] Feb 16 22:08:00 crc kubenswrapper[4777]: I0216 22:08:00.202968 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbae3fdc-fa7e-41bc-8c73-8d126b476ba5" path="/var/lib/kubelet/pods/bbae3fdc-fa7e-41bc-8c73-8d126b476ba5/volumes" Feb 16 22:08:02 crc kubenswrapper[4777]: I0216 22:08:02.704921 4777 scope.go:117] "RemoveContainer" containerID="e6e0fac9a23539a5e1997ed9ef0cf2b5b25096a74896d26c9a4a784460fffcb8" Feb 16 22:08:02 crc kubenswrapper[4777]: I0216 22:08:02.760143 4777 scope.go:117] "RemoveContainer" containerID="58e1fae68ae66eb14d8ac2623ba9582eab0d8f91d406bfa69593cabf55636cb3" Feb 16 22:08:02 crc kubenswrapper[4777]: I0216 22:08:02.819737 4777 scope.go:117] "RemoveContainer" containerID="517f476878727fab47fc8e4916d29876152c9fc73b1534fed166306ffeff1cc1" Feb 16 22:08:02 crc kubenswrapper[4777]: I0216 22:08:02.878790 4777 scope.go:117] "RemoveContainer" containerID="d35620637999f5cd8ece8a4c7c76a47bf52b4ed6004267c2a839062312294c31" Feb 16 22:08:02 crc kubenswrapper[4777]: I0216 22:08:02.972683 4777 scope.go:117] "RemoveContainer" containerID="16e74f80d34047aae280d4cb00f66b7b04f522d3e15d239d403bdad590286693" Feb 16 22:08:03 crc kubenswrapper[4777]: I0216 22:08:03.031925 4777 scope.go:117] "RemoveContainer" containerID="a8dd4a159b0dcc5a5eb78e45698ccdde0b68f7b0cc10ff560d6086af4b139ec4" Feb 16 22:08:03 crc kubenswrapper[4777]: I0216 22:08:03.037509 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-6597-account-create-update-8t7kc"] Feb 16 22:08:03 crc kubenswrapper[4777]: I0216 22:08:03.054974 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-q9pnh"] Feb 16 22:08:03 crc kubenswrapper[4777]: I0216 22:08:03.069553 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-q9pnh"] Feb 16 22:08:03 crc kubenswrapper[4777]: I0216 22:08:03.079169 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6597-account-create-update-8t7kc"] Feb 16 22:08:03 crc kubenswrapper[4777]: I0216 22:08:03.080468 4777 scope.go:117] "RemoveContainer" containerID="019d0a474c1289621b07f9f000fb21ff4781d5d0d6fdf6d2e59f3aa45a1675f9" Feb 16 22:08:04 crc kubenswrapper[4777]: I0216 22:08:04.199098 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61160d7e-1d8e-47fa-94c6-02c654559bca" path="/var/lib/kubelet/pods/61160d7e-1d8e-47fa-94c6-02c654559bca/volumes" Feb 16 22:08:04 crc kubenswrapper[4777]: I0216 22:08:04.200455 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e859b095-dbc8-4bea-8d75-e576976dffaa" path="/var/lib/kubelet/pods/e859b095-dbc8-4bea-8d75-e576976dffaa/volumes" Feb 16 22:08:05 crc kubenswrapper[4777]: I0216 22:08:05.064809 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4faf-account-create-update-xz29p"] Feb 16 22:08:05 crc kubenswrapper[4777]: I0216 22:08:05.083039 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-20bd-account-create-update-pdrs4"] Feb 16 22:08:05 crc kubenswrapper[4777]: I0216 22:08:05.095687 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4faf-account-create-update-xz29p"] Feb 16 22:08:05 crc kubenswrapper[4777]: I0216 22:08:05.107795 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-20bd-account-create-update-pdrs4"] Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.053768 
4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-feeb-account-create-update-kf5rr"] Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.068854 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-feeb-account-create-update-kf5rr"] Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.079412 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-dqt5c"] Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.087410 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wxbd4"] Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.095414 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-dqt5c"] Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.103949 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-c9w56"] Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.111587 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wxbd4"] Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.119972 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-c9w56"] Feb 16 22:08:06 crc kubenswrapper[4777]: E0216 22:08:06.185385 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.195196 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc41bf6-4400-47d8-bada-837e28d0d42c" path="/var/lib/kubelet/pods/5dc41bf6-4400-47d8-bada-837e28d0d42c/volumes" Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.196002 4777 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6829d7a1-c6c5-4e57-be05-9b6335f1ad0f" path="/var/lib/kubelet/pods/6829d7a1-c6c5-4e57-be05-9b6335f1ad0f/volumes" Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.196627 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c19c4b-cd3c-43b7-bb94-dba6086f3980" path="/var/lib/kubelet/pods/68c19c4b-cd3c-43b7-bb94-dba6086f3980/volumes" Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.197261 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b567a40e-3ae3-4d2d-be6c-25d0e1175711" path="/var/lib/kubelet/pods/b567a40e-3ae3-4d2d-be6c-25d0e1175711/volumes" Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.198322 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf2488e5-f431-4464-b9f9-f549322948d1" path="/var/lib/kubelet/pods/cf2488e5-f431-4464-b9f9-f549322948d1/volumes" Feb 16 22:08:06 crc kubenswrapper[4777]: I0216 22:08:06.198870 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5063ed0-49f5-4947-af98-867246696986" path="/var/lib/kubelet/pods/e5063ed0-49f5-4947-af98-867246696986/volumes" Feb 16 22:08:08 crc kubenswrapper[4777]: I0216 22:08:08.182161 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:08:08 crc kubenswrapper[4777]: E0216 22:08:08.182899 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:08:09 crc kubenswrapper[4777]: I0216 22:08:09.034545 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-db-sync-fwhrq"] Feb 16 22:08:09 crc kubenswrapper[4777]: I0216 22:08:09.048704 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-fwhrq"] Feb 16 22:08:10 crc kubenswrapper[4777]: I0216 22:08:10.205394 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865f0225-f1e8-4fb3-bc27-df7bfdb04f8c" path="/var/lib/kubelet/pods/865f0225-f1e8-4fb3-bc27-df7bfdb04f8c/volumes" Feb 16 22:08:14 crc kubenswrapper[4777]: I0216 22:08:14.047488 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-h4sfw"] Feb 16 22:08:14 crc kubenswrapper[4777]: I0216 22:08:14.064036 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-h4sfw"] Feb 16 22:08:14 crc kubenswrapper[4777]: I0216 22:08:14.204236 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dde5180-41b7-4ccd-b5bf-d144b205d163" path="/var/lib/kubelet/pods/5dde5180-41b7-4ccd-b5bf-d144b205d163/volumes" Feb 16 22:08:17 crc kubenswrapper[4777]: E0216 22:08:17.184741 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:08:19 crc kubenswrapper[4777]: I0216 22:08:19.182640 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:08:19 crc kubenswrapper[4777]: E0216 22:08:19.183468 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:08:31 crc kubenswrapper[4777]: I0216 22:08:31.182231 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:08:31 crc kubenswrapper[4777]: E0216 22:08:31.183300 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:08:32 crc kubenswrapper[4777]: E0216 22:08:32.184574 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:08:43 crc kubenswrapper[4777]: I0216 22:08:43.045397 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-p4v8h"] Feb 16 22:08:43 crc kubenswrapper[4777]: I0216 22:08:43.055156 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-p4v8h"] Feb 16 22:08:44 crc kubenswrapper[4777]: I0216 22:08:44.203840 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6681c372-76f5-4242-a533-0db4f1e711c8" path="/var/lib/kubelet/pods/6681c372-76f5-4242-a533-0db4f1e711c8/volumes" Feb 16 22:08:46 crc kubenswrapper[4777]: I0216 22:08:46.061890 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-pgthg"] Feb 16 22:08:46 crc kubenswrapper[4777]: I0216 22:08:46.075376 4777 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-pgthg"] Feb 16 22:08:46 crc kubenswrapper[4777]: I0216 22:08:46.185066 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:08:46 crc kubenswrapper[4777]: E0216 22:08:46.185287 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:08:46 crc kubenswrapper[4777]: I0216 22:08:46.197651 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="332f9253-5c7a-4cc9-aead-9adc1fe86b2e" path="/var/lib/kubelet/pods/332f9253-5c7a-4cc9-aead-9adc1fe86b2e/volumes" Feb 16 22:08:47 crc kubenswrapper[4777]: E0216 22:08:47.184867 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:08:52 crc kubenswrapper[4777]: I0216 22:08:52.046296 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-njzfj"] Feb 16 22:08:52 crc kubenswrapper[4777]: I0216 22:08:52.069218 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-njzfj"] Feb 16 22:08:52 crc kubenswrapper[4777]: I0216 22:08:52.214086 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7010b1ce-fc40-4fc8-ae63-53368dfd55f9" path="/var/lib/kubelet/pods/7010b1ce-fc40-4fc8-ae63-53368dfd55f9/volumes" Feb 16 22:08:57 crc 
kubenswrapper[4777]: I0216 22:08:57.182791 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:08:57 crc kubenswrapper[4777]: E0216 22:08:57.184117 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:08:59 crc kubenswrapper[4777]: I0216 22:08:59.042071 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-pl2w2"] Feb 16 22:08:59 crc kubenswrapper[4777]: I0216 22:08:59.052600 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-pl2w2"] Feb 16 22:09:00 crc kubenswrapper[4777]: E0216 22:09:00.200311 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:09:00 crc kubenswrapper[4777]: I0216 22:09:00.201566 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a4d7671-062a-4647-9a54-6933f7cc3a4d" path="/var/lib/kubelet/pods/4a4d7671-062a-4647-9a54-6933f7cc3a4d/volumes" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.237992 4777 scope.go:117] "RemoveContainer" containerID="23005eefba37937f6630df4696c151df8a7e93d03a7172ab3fa1f13f5dc8d1f7" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.279540 4777 scope.go:117] "RemoveContainer" containerID="e17565b4ac5f1ccb765fc317a61ae3cb0a819c77fa7ff30afb09275ac294991e" Feb 16 22:09:03 crc 
kubenswrapper[4777]: I0216 22:09:03.346688 4777 scope.go:117] "RemoveContainer" containerID="97ddbf1da018ec47aa1e601737e5ff32b412d656f33dc531197a111f70258c46" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.400544 4777 scope.go:117] "RemoveContainer" containerID="2cbedc7a5b8b00e413c87ba7d9ebd19f5a7d93ebb97a6344216d412100b2034d" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.429798 4777 scope.go:117] "RemoveContainer" containerID="eca6c7b5b9ce4f355f27e7fa3c6f275807e1bbaf26835460fab1390227518218" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.505806 4777 scope.go:117] "RemoveContainer" containerID="8431a8095a25d5d0417790cd948dc752bc6f939f39142f76bfd175d99efadf92" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.533456 4777 scope.go:117] "RemoveContainer" containerID="d0686f70a8a7c62232275422cf5ebd883e6d555c19b952c8f5f33f9dfab3e888" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.581683 4777 scope.go:117] "RemoveContainer" containerID="62efaf26c417064b51a69213ea0542cc2e59224f3af56ea1395dbeb01d212e7a" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.602511 4777 scope.go:117] "RemoveContainer" containerID="82212abdbb1c19980082ee4ecdb242e576a429f845cee036d68c272abfb7d41a" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.630050 4777 scope.go:117] "RemoveContainer" containerID="f5c6f3186946a134960253907604d5742844e72df76628471d7423224b92ab9d" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.649564 4777 scope.go:117] "RemoveContainer" containerID="8c3c4e6df7d757d6543c535e83a20260f43671d7f30177c3e175f3c230eca68a" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.678231 4777 scope.go:117] "RemoveContainer" containerID="fbcb8df8f0c47ccb68ea6be776538a34e1adffce08b1bf040774222cce989117" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 22:09:03.700604 4777 scope.go:117] "RemoveContainer" containerID="6ced8a5ba17e392ecffdea1ece914209ced3a576c5624f5039c6f2cb8bb093bf" Feb 16 22:09:03 crc kubenswrapper[4777]: I0216 
22:09:03.732360 4777 scope.go:117] "RemoveContainer" containerID="3276a0c72c1bf3608f60770961e05b9c8139e26f26c6943340511ee32613ca2d" Feb 16 22:09:08 crc kubenswrapper[4777]: I0216 22:09:08.056976 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-z9tj6"] Feb 16 22:09:08 crc kubenswrapper[4777]: I0216 22:09:08.066834 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-z9tj6"] Feb 16 22:09:08 crc kubenswrapper[4777]: I0216 22:09:08.197033 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="626f6429-977f-4c1f-b055-3502cb530645" path="/var/lib/kubelet/pods/626f6429-977f-4c1f-b055-3502cb530645/volumes" Feb 16 22:09:12 crc kubenswrapper[4777]: I0216 22:09:12.183212 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2" Feb 16 22:09:12 crc kubenswrapper[4777]: I0216 22:09:12.814855 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"62eb03a55fe17d4c32ab25ce27d34307e3c49a4f99a2d2f05c0cefe5f385fa4e"} Feb 16 22:09:14 crc kubenswrapper[4777]: E0216 22:09:14.320320 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:09:14 crc kubenswrapper[4777]: E0216 22:09:14.320867 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:09:14 crc kubenswrapper[4777]: E0216 22:09:14.321113 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/
var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError" Feb 16 22:09:14 crc kubenswrapper[4777]: E0216 22:09:14.322454 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:09:29 crc kubenswrapper[4777]: E0216 22:09:29.183672 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.070802 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-tgbml"] Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.087873 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-dbc0-account-create-update-l6rwv"] Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.099850 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-dbc0-account-create-update-l6rwv"] Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.121497 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-tgbml"] Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.136116 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f289-account-create-update-79rg7"] Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.145903 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5dkfq"] Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.153895 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-dmsfj"] Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.161566 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1962-account-create-update-sdcsp"] Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.168431 4777 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-cell0-db-create-5dkfq"] Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.175151 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f289-account-create-update-79rg7"] Feb 16 22:09:44 crc kubenswrapper[4777]: E0216 22:09:44.184898 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.201143 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095f5bf0-5bb9-42b3-ae28-a7bef2475cfc" path="/var/lib/kubelet/pods/095f5bf0-5bb9-42b3-ae28-a7bef2475cfc/volumes" Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.201736 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1218f6-7d46-4c08-be98-e27104f96caf" path="/var/lib/kubelet/pods/5a1218f6-7d46-4c08-be98-e27104f96caf/volumes" Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.202248 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df482342-2142-4778-9751-7f8c8daaccd8" path="/var/lib/kubelet/pods/df482342-2142-4778-9751-7f8c8daaccd8/volumes" Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.202893 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22e9b3b-7b09-4704-a1aa-b219e416360c" path="/var/lib/kubelet/pods/e22e9b3b-7b09-4704-a1aa-b219e416360c/volumes" Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.203887 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-dmsfj"] Feb 16 22:09:44 crc kubenswrapper[4777]: I0216 22:09:44.203919 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1962-account-create-update-sdcsp"] Feb 16 22:09:46 
crc kubenswrapper[4777]: I0216 22:09:46.206782 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c47b5e-f997-4751-9fe6-e35d904a7fea" path="/var/lib/kubelet/pods/54c47b5e-f997-4751-9fe6-e35d904a7fea/volumes" Feb 16 22:09:46 crc kubenswrapper[4777]: I0216 22:09:46.209238 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95770934-2a37-4fc6-b3e4-5ffd1e429f4b" path="/var/lib/kubelet/pods/95770934-2a37-4fc6-b3e4-5ffd1e429f4b/volumes" Feb 16 22:09:57 crc kubenswrapper[4777]: E0216 22:09:57.186429 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:10:04 crc kubenswrapper[4777]: I0216 22:10:04.004202 4777 scope.go:117] "RemoveContainer" containerID="2f01910c40c0fdb6027449b04396c1157e522b9ec02b91bbdea84270de3cebf6" Feb 16 22:10:04 crc kubenswrapper[4777]: I0216 22:10:04.045667 4777 scope.go:117] "RemoveContainer" containerID="1235d5d4c6e5382d1761fa3b85dc0a06c75a0964a0b89fb30d77cca1335797c6" Feb 16 22:10:04 crc kubenswrapper[4777]: I0216 22:10:04.115425 4777 scope.go:117] "RemoveContainer" containerID="bc31cbc847db042c93eb6192ca824981d5348d547d3db2fdbfbeb31c94b60cb3" Feb 16 22:10:04 crc kubenswrapper[4777]: I0216 22:10:04.154383 4777 scope.go:117] "RemoveContainer" containerID="c8b8f4f19e1088f06bd68a586cb548410b29090ae7567e1811d9628dd9caf510" Feb 16 22:10:04 crc kubenswrapper[4777]: I0216 22:10:04.271187 4777 scope.go:117] "RemoveContainer" containerID="1ecb2e533d2619bf146c9f16fc7067bb6942f1d1dfb3d90bad41d0b0787da01a" Feb 16 22:10:04 crc kubenswrapper[4777]: I0216 22:10:04.306626 4777 scope.go:117] "RemoveContainer" containerID="bef5da1ad3c47056bea28d72ac12503974c64db3a6f5595d2a02caa0fa368e65" Feb 16 22:10:04 crc 
kubenswrapper[4777]: I0216 22:10:04.352211 4777 scope.go:117] "RemoveContainer" containerID="777a58515f8a44311bc51bc1a7a4dee89fc2bb699cd9f1aa52226d197b03d0dc" Feb 16 22:10:10 crc kubenswrapper[4777]: E0216 22:10:10.198511 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:10:11 crc kubenswrapper[4777]: I0216 22:10:11.096842 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h7jcx"] Feb 16 22:10:11 crc kubenswrapper[4777]: I0216 22:10:11.119505 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h7jcx"] Feb 16 22:10:12 crc kubenswrapper[4777]: I0216 22:10:12.210849 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2997f9cf-78c3-458d-b5c4-7c774d1c84a8" path="/var/lib/kubelet/pods/2997f9cf-78c3-458d-b5c4-7c774d1c84a8/volumes" Feb 16 22:10:22 crc kubenswrapper[4777]: E0216 22:10:22.184957 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:10:31 crc kubenswrapper[4777]: I0216 22:10:31.058694 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2gskq"] Feb 16 22:10:31 crc kubenswrapper[4777]: I0216 22:10:31.072077 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-m5pdh"] Feb 16 22:10:31 crc kubenswrapper[4777]: I0216 22:10:31.083569 4777 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-cell0-cell-mapping-m5pdh"]
Feb 16 22:10:31 crc kubenswrapper[4777]: I0216 22:10:31.092920 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2gskq"]
Feb 16 22:10:32 crc kubenswrapper[4777]: I0216 22:10:32.195563 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd16d27-1038-4aba-89fa-c789dfb631af" path="/var/lib/kubelet/pods/2cd16d27-1038-4aba-89fa-c789dfb631af/volumes"
Feb 16 22:10:32 crc kubenswrapper[4777]: I0216 22:10:32.197064 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f6de59-67d3-49c4-983f-5352c6178e5c" path="/var/lib/kubelet/pods/b6f6de59-67d3-49c4-983f-5352c6178e5c/volumes"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.476223 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6l8vv"]
Feb 16 22:10:33 crc kubenswrapper[4777]: E0216 22:10:33.478686 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95bef989-7403-4b64-8b34-5105cd921de1" containerName="extract-content"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.478863 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="95bef989-7403-4b64-8b34-5105cd921de1" containerName="extract-content"
Feb 16 22:10:33 crc kubenswrapper[4777]: E0216 22:10:33.479014 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95bef989-7403-4b64-8b34-5105cd921de1" containerName="extract-utilities"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.479122 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="95bef989-7403-4b64-8b34-5105cd921de1" containerName="extract-utilities"
Feb 16 22:10:33 crc kubenswrapper[4777]: E0216 22:10:33.479248 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95bef989-7403-4b64-8b34-5105cd921de1" containerName="registry-server"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.479353 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="95bef989-7403-4b64-8b34-5105cd921de1" containerName="registry-server"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.479855 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="95bef989-7403-4b64-8b34-5105cd921de1" containerName="registry-server"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.482537 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.489862 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6l8vv"]
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.587664 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-catalog-content\") pod \"community-operators-6l8vv\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") " pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.587786 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4b8c\" (UniqueName: \"kubernetes.io/projected/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-kube-api-access-k4b8c\") pod \"community-operators-6l8vv\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") " pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.587873 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-utilities\") pod \"community-operators-6l8vv\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") " pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.690587 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-catalog-content\") pod \"community-operators-6l8vv\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") " pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.690679 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4b8c\" (UniqueName: \"kubernetes.io/projected/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-kube-api-access-k4b8c\") pod \"community-operators-6l8vv\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") " pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.690780 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-utilities\") pod \"community-operators-6l8vv\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") " pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.691398 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-utilities\") pod \"community-operators-6l8vv\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") " pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.691682 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-catalog-content\") pod \"community-operators-6l8vv\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") " pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.721152 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4b8c\" (UniqueName: \"kubernetes.io/projected/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-kube-api-access-k4b8c\") pod \"community-operators-6l8vv\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") " pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:33 crc kubenswrapper[4777]: I0216 22:10:33.852453 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:34 crc kubenswrapper[4777]: E0216 22:10:34.183353 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:10:34 crc kubenswrapper[4777]: W0216 22:10:34.351609 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad327ec_dfa6_4132_a93f_bb7758b2cc6f.slice/crio-e2f1f7fb54a3045beaf1db48ae60d7f9721e062859cb382c24eb6383b362456d WatchSource:0}: Error finding container e2f1f7fb54a3045beaf1db48ae60d7f9721e062859cb382c24eb6383b362456d: Status 404 returned error can't find the container with id e2f1f7fb54a3045beaf1db48ae60d7f9721e062859cb382c24eb6383b362456d
Feb 16 22:10:34 crc kubenswrapper[4777]: I0216 22:10:34.356764 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6l8vv"]
Feb 16 22:10:34 crc kubenswrapper[4777]: I0216 22:10:34.830890 4777 generic.go:334] "Generic (PLEG): container finished" podID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" containerID="f3fdd3c4e9f097b607e7fe2381e3e09ce2814d592fc677fe7ee3d1f12b7ee623" exitCode=0
Feb 16 22:10:34 crc kubenswrapper[4777]: I0216 22:10:34.831117 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6l8vv" event={"ID":"fad327ec-dfa6-4132-a93f-bb7758b2cc6f","Type":"ContainerDied","Data":"f3fdd3c4e9f097b607e7fe2381e3e09ce2814d592fc677fe7ee3d1f12b7ee623"}
Feb 16 22:10:34 crc kubenswrapper[4777]: I0216 22:10:34.831343 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6l8vv" event={"ID":"fad327ec-dfa6-4132-a93f-bb7758b2cc6f","Type":"ContainerStarted","Data":"e2f1f7fb54a3045beaf1db48ae60d7f9721e062859cb382c24eb6383b362456d"}
Feb 16 22:10:34 crc kubenswrapper[4777]: I0216 22:10:34.834252 4777 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 16 22:10:36 crc kubenswrapper[4777]: I0216 22:10:36.854784 4777 generic.go:334] "Generic (PLEG): container finished" podID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" containerID="6d53045f581858bff0e6c27f66f7cedd0795d26d711a714c79436abd9d33f4d1" exitCode=0
Feb 16 22:10:36 crc kubenswrapper[4777]: I0216 22:10:36.854872 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6l8vv" event={"ID":"fad327ec-dfa6-4132-a93f-bb7758b2cc6f","Type":"ContainerDied","Data":"6d53045f581858bff0e6c27f66f7cedd0795d26d711a714c79436abd9d33f4d1"}
Feb 16 22:10:37 crc kubenswrapper[4777]: I0216 22:10:37.877522 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6l8vv" event={"ID":"fad327ec-dfa6-4132-a93f-bb7758b2cc6f","Type":"ContainerStarted","Data":"752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0"}
Feb 16 22:10:37 crc kubenswrapper[4777]: I0216 22:10:37.908561 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6l8vv" podStartSLOduration=2.50715909 podStartE2EDuration="4.908531295s" podCreationTimestamp="2026-02-16 22:10:33 +0000 UTC" firstStartedPulling="2026-02-16 22:10:34.833680826 +0000 UTC m=+1955.416181968" lastFinishedPulling="2026-02-16 22:10:37.235053061 +0000 UTC m=+1957.817554173" observedRunningTime="2026-02-16 22:10:37.903375658 +0000 UTC m=+1958.485876800" watchObservedRunningTime="2026-02-16 22:10:37.908531295 +0000 UTC m=+1958.491032437"
Feb 16 22:10:43 crc kubenswrapper[4777]: I0216 22:10:43.852793 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:43 crc kubenswrapper[4777]: I0216 22:10:43.853378 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:43 crc kubenswrapper[4777]: I0216 22:10:43.943279 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:44 crc kubenswrapper[4777]: I0216 22:10:44.055683 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:44 crc kubenswrapper[4777]: I0216 22:10:44.206107 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6l8vv"]
Feb 16 22:10:45 crc kubenswrapper[4777]: I0216 22:10:45.991053 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6l8vv" podUID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" containerName="registry-server" containerID="cri-o://752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0" gracePeriod=2
Feb 16 22:10:46 crc kubenswrapper[4777]: I0216 22:10:46.556700 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:46 crc kubenswrapper[4777]: I0216 22:10:46.734791 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-catalog-content\") pod \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") "
Feb 16 22:10:46 crc kubenswrapper[4777]: I0216 22:10:46.734870 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4b8c\" (UniqueName: \"kubernetes.io/projected/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-kube-api-access-k4b8c\") pod \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") "
Feb 16 22:10:46 crc kubenswrapper[4777]: I0216 22:10:46.734896 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-utilities\") pod \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\" (UID: \"fad327ec-dfa6-4132-a93f-bb7758b2cc6f\") "
Feb 16 22:10:46 crc kubenswrapper[4777]: I0216 22:10:46.736511 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-utilities" (OuterVolumeSpecName: "utilities") pod "fad327ec-dfa6-4132-a93f-bb7758b2cc6f" (UID: "fad327ec-dfa6-4132-a93f-bb7758b2cc6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:10:46 crc kubenswrapper[4777]: I0216 22:10:46.739844 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-kube-api-access-k4b8c" (OuterVolumeSpecName: "kube-api-access-k4b8c") pod "fad327ec-dfa6-4132-a93f-bb7758b2cc6f" (UID: "fad327ec-dfa6-4132-a93f-bb7758b2cc6f"). InnerVolumeSpecName "kube-api-access-k4b8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 22:10:46 crc kubenswrapper[4777]: I0216 22:10:46.837708 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4b8c\" (UniqueName: \"kubernetes.io/projected/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-kube-api-access-k4b8c\") on node \"crc\" DevicePath \"\""
Feb 16 22:10:46 crc kubenswrapper[4777]: I0216 22:10:46.837758 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-utilities\") on node \"crc\" DevicePath \"\""
Feb 16 22:10:46 crc kubenswrapper[4777]: I0216 22:10:46.888153 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fad327ec-dfa6-4132-a93f-bb7758b2cc6f" (UID: "fad327ec-dfa6-4132-a93f-bb7758b2cc6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:10:46 crc kubenswrapper[4777]: I0216 22:10:46.939291 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fad327ec-dfa6-4132-a93f-bb7758b2cc6f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.005593 4777 generic.go:334] "Generic (PLEG): container finished" podID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" containerID="752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0" exitCode=0
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.005663 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6l8vv"
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.005662 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6l8vv" event={"ID":"fad327ec-dfa6-4132-a93f-bb7758b2cc6f","Type":"ContainerDied","Data":"752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0"}
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.005781 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6l8vv" event={"ID":"fad327ec-dfa6-4132-a93f-bb7758b2cc6f","Type":"ContainerDied","Data":"e2f1f7fb54a3045beaf1db48ae60d7f9721e062859cb382c24eb6383b362456d"}
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.005820 4777 scope.go:117] "RemoveContainer" containerID="752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0"
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.051464 4777 scope.go:117] "RemoveContainer" containerID="6d53045f581858bff0e6c27f66f7cedd0795d26d711a714c79436abd9d33f4d1"
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.056453 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6l8vv"]
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.081013 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6l8vv"]
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.107185 4777 scope.go:117] "RemoveContainer" containerID="f3fdd3c4e9f097b607e7fe2381e3e09ce2814d592fc677fe7ee3d1f12b7ee623"
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.143592 4777 scope.go:117] "RemoveContainer" containerID="752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0"
Feb 16 22:10:47 crc kubenswrapper[4777]: E0216 22:10:47.144817 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0\": container with ID starting with 752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0 not found: ID does not exist" containerID="752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0"
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.144888 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0"} err="failed to get container status \"752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0\": rpc error: code = NotFound desc = could not find container \"752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0\": container with ID starting with 752f97f2e18bb8e8b407e64f416f172f8f4d22d97c2abb0ec8450c26c722e7d0 not found: ID does not exist"
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.144931 4777 scope.go:117] "RemoveContainer" containerID="6d53045f581858bff0e6c27f66f7cedd0795d26d711a714c79436abd9d33f4d1"
Feb 16 22:10:47 crc kubenswrapper[4777]: E0216 22:10:47.145538 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d53045f581858bff0e6c27f66f7cedd0795d26d711a714c79436abd9d33f4d1\": container with ID starting with 6d53045f581858bff0e6c27f66f7cedd0795d26d711a714c79436abd9d33f4d1 not found: ID does not exist" containerID="6d53045f581858bff0e6c27f66f7cedd0795d26d711a714c79436abd9d33f4d1"
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.145625 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d53045f581858bff0e6c27f66f7cedd0795d26d711a714c79436abd9d33f4d1"} err="failed to get container status \"6d53045f581858bff0e6c27f66f7cedd0795d26d711a714c79436abd9d33f4d1\": rpc error: code = NotFound desc = could not find container \"6d53045f581858bff0e6c27f66f7cedd0795d26d711a714c79436abd9d33f4d1\": container with ID starting with 6d53045f581858bff0e6c27f66f7cedd0795d26d711a714c79436abd9d33f4d1 not found: ID does not exist"
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.145690 4777 scope.go:117] "RemoveContainer" containerID="f3fdd3c4e9f097b607e7fe2381e3e09ce2814d592fc677fe7ee3d1f12b7ee623"
Feb 16 22:10:47 crc kubenswrapper[4777]: E0216 22:10:47.146212 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3fdd3c4e9f097b607e7fe2381e3e09ce2814d592fc677fe7ee3d1f12b7ee623\": container with ID starting with f3fdd3c4e9f097b607e7fe2381e3e09ce2814d592fc677fe7ee3d1f12b7ee623 not found: ID does not exist" containerID="f3fdd3c4e9f097b607e7fe2381e3e09ce2814d592fc677fe7ee3d1f12b7ee623"
Feb 16 22:10:47 crc kubenswrapper[4777]: I0216 22:10:47.146257 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3fdd3c4e9f097b607e7fe2381e3e09ce2814d592fc677fe7ee3d1f12b7ee623"} err="failed to get container status \"f3fdd3c4e9f097b607e7fe2381e3e09ce2814d592fc677fe7ee3d1f12b7ee623\": rpc error: code = NotFound desc = could not find container \"f3fdd3c4e9f097b607e7fe2381e3e09ce2814d592fc677fe7ee3d1f12b7ee623\": container with ID starting with f3fdd3c4e9f097b607e7fe2381e3e09ce2814d592fc677fe7ee3d1f12b7ee623 not found: ID does not exist"
Feb 16 22:10:48 crc kubenswrapper[4777]: E0216 22:10:48.183814 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:10:48 crc kubenswrapper[4777]: I0216 22:10:48.201128 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" path="/var/lib/kubelet/pods/fad327ec-dfa6-4132-a93f-bb7758b2cc6f/volumes"
Feb 16 22:11:03 crc kubenswrapper[4777]: E0216 22:11:03.184954 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:11:04 crc kubenswrapper[4777]: I0216 22:11:04.511254 4777 scope.go:117] "RemoveContainer" containerID="b29243746f7ac132751aa74466f197b11898a9e98a2b8b1009dc3e5eecb977a6"
Feb 16 22:11:04 crc kubenswrapper[4777]: I0216 22:11:04.582905 4777 scope.go:117] "RemoveContainer" containerID="b23ad5eb9ad435dd251c02ab2d2791e4c6c439edb4a3737c0f08f4ab7cb7beaa"
Feb 16 22:11:04 crc kubenswrapper[4777]: I0216 22:11:04.636051 4777 scope.go:117] "RemoveContainer" containerID="199065924a96a5dec9434b19498eab9aab5d5463dee5bb8751d6e560bd611b3b"
Feb 16 22:11:15 crc kubenswrapper[4777]: E0216 22:11:15.185775 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:11:16 crc kubenswrapper[4777]: I0216 22:11:16.075359 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-l98fl"]
Feb 16 22:11:16 crc kubenswrapper[4777]: I0216 22:11:16.089925 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-l98fl"]
Feb 16 22:11:16 crc kubenswrapper[4777]: I0216 22:11:16.203385 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee894943-d453-4455-be80-81a3c20ad9de" path="/var/lib/kubelet/pods/ee894943-d453-4455-be80-81a3c20ad9de/volumes"
Feb 16 22:11:28 crc kubenswrapper[4777]: E0216 22:11:28.185508 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:11:39 crc kubenswrapper[4777]: E0216 22:11:39.184501 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:11:41 crc kubenswrapper[4777]: I0216 22:11:41.651486 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:11:41 crc kubenswrapper[4777]: I0216 22:11:41.652829 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:11:54 crc kubenswrapper[4777]: E0216 22:11:54.185241 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:12:04 crc kubenswrapper[4777]: I0216 22:12:04.791433 4777 scope.go:117] "RemoveContainer" containerID="c0bca3a547bb50a2025f4e724de9e912dc95e5a230fb99abfaf552efaa427df9"
Feb 16 22:12:05 crc kubenswrapper[4777]: E0216 22:12:05.183822 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:12:11 crc kubenswrapper[4777]: I0216 22:12:11.651542 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:12:11 crc kubenswrapper[4777]: I0216 22:12:11.652111 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:12:18 crc kubenswrapper[4777]: E0216 22:12:18.185739 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:12:23 crc kubenswrapper[4777]: I0216 22:12:23.789033 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2qbdn"]
Feb 16 22:12:23 crc kubenswrapper[4777]: E0216 22:12:23.790363 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" containerName="extract-content"
Feb 16 22:12:23 crc kubenswrapper[4777]: I0216 22:12:23.790387 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" containerName="extract-content"
Feb 16 22:12:23 crc kubenswrapper[4777]: E0216 22:12:23.790410 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" containerName="extract-utilities"
Feb 16 22:12:23 crc kubenswrapper[4777]: I0216 22:12:23.790421 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" containerName="extract-utilities"
Feb 16 22:12:23 crc kubenswrapper[4777]: E0216 22:12:23.790436 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" containerName="registry-server"
Feb 16 22:12:23 crc kubenswrapper[4777]: I0216 22:12:23.790448 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" containerName="registry-server"
Feb 16 22:12:23 crc kubenswrapper[4777]: I0216 22:12:23.790808 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad327ec-dfa6-4132-a93f-bb7758b2cc6f" containerName="registry-server"
Feb 16 22:12:23 crc kubenswrapper[4777]: I0216 22:12:23.792938 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:23 crc kubenswrapper[4777]: I0216 22:12:23.800386 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qbdn"]
Feb 16 22:12:23 crc kubenswrapper[4777]: I0216 22:12:23.928192 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-catalog-content\") pod \"redhat-operators-2qbdn\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:23 crc kubenswrapper[4777]: I0216 22:12:23.928247 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-utilities\") pod \"redhat-operators-2qbdn\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:23 crc kubenswrapper[4777]: I0216 22:12:23.928273 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm92x\" (UniqueName: \"kubernetes.io/projected/014399af-6c75-41f6-acd9-f7f43d84dffd-kube-api-access-mm92x\") pod \"redhat-operators-2qbdn\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:24 crc kubenswrapper[4777]: I0216 22:12:24.030900 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-catalog-content\") pod \"redhat-operators-2qbdn\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:24 crc kubenswrapper[4777]: I0216 22:12:24.031200 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-utilities\") pod \"redhat-operators-2qbdn\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:24 crc kubenswrapper[4777]: I0216 22:12:24.031361 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm92x\" (UniqueName: \"kubernetes.io/projected/014399af-6c75-41f6-acd9-f7f43d84dffd-kube-api-access-mm92x\") pod \"redhat-operators-2qbdn\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:24 crc kubenswrapper[4777]: I0216 22:12:24.031472 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-catalog-content\") pod \"redhat-operators-2qbdn\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:24 crc kubenswrapper[4777]: I0216 22:12:24.031485 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-utilities\") pod \"redhat-operators-2qbdn\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:24 crc kubenswrapper[4777]: I0216 22:12:24.049660 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm92x\" (UniqueName: \"kubernetes.io/projected/014399af-6c75-41f6-acd9-f7f43d84dffd-kube-api-access-mm92x\") pod \"redhat-operators-2qbdn\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:24 crc kubenswrapper[4777]: I0216 22:12:24.127488 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:24 crc kubenswrapper[4777]: I0216 22:12:24.597684 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qbdn"]
Feb 16 22:12:24 crc kubenswrapper[4777]: W0216 22:12:24.602272 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod014399af_6c75_41f6_acd9_f7f43d84dffd.slice/crio-d2466c57e0ca07afe56565bb9661159d3099db1e24ee7ce484a6867a4e3d29c2 WatchSource:0}: Error finding container d2466c57e0ca07afe56565bb9661159d3099db1e24ee7ce484a6867a4e3d29c2: Status 404 returned error can't find the container with id d2466c57e0ca07afe56565bb9661159d3099db1e24ee7ce484a6867a4e3d29c2
Feb 16 22:12:25 crc kubenswrapper[4777]: I0216 22:12:25.214019 4777 generic.go:334] "Generic (PLEG): container finished" podID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerID="3f80bc93c1372942fc3355ab88eb049b0dcdfdde5e7038604b72173ae6d6ae32" exitCode=0
Feb 16 22:12:25 crc kubenswrapper[4777]: I0216 22:12:25.214217 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qbdn" event={"ID":"014399af-6c75-41f6-acd9-f7f43d84dffd","Type":"ContainerDied","Data":"3f80bc93c1372942fc3355ab88eb049b0dcdfdde5e7038604b72173ae6d6ae32"}
Feb 16 22:12:25 crc kubenswrapper[4777]: I0216 22:12:25.214335 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qbdn" event={"ID":"014399af-6c75-41f6-acd9-f7f43d84dffd","Type":"ContainerStarted","Data":"d2466c57e0ca07afe56565bb9661159d3099db1e24ee7ce484a6867a4e3d29c2"}
Feb 16 22:12:27 crc kubenswrapper[4777]: I0216 22:12:27.236569 4777 generic.go:334] "Generic (PLEG): container finished" podID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerID="c9f4b7e95f36ed6de648e9beb5cc4688c6d5e83165699d52f900cb50cd7b0a7b" exitCode=0
Feb 16 22:12:27 crc kubenswrapper[4777]: I0216 22:12:27.236837 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qbdn" event={"ID":"014399af-6c75-41f6-acd9-f7f43d84dffd","Type":"ContainerDied","Data":"c9f4b7e95f36ed6de648e9beb5cc4688c6d5e83165699d52f900cb50cd7b0a7b"}
Feb 16 22:12:28 crc kubenswrapper[4777]: I0216 22:12:28.253417 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qbdn" event={"ID":"014399af-6c75-41f6-acd9-f7f43d84dffd","Type":"ContainerStarted","Data":"e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520"}
Feb 16 22:12:28 crc kubenswrapper[4777]: I0216 22:12:28.280411 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2qbdn" podStartSLOduration=2.85819886 podStartE2EDuration="5.280393635s" podCreationTimestamp="2026-02-16 22:12:23 +0000 UTC" firstStartedPulling="2026-02-16 22:12:25.215703694 +0000 UTC m=+2065.798204796" lastFinishedPulling="2026-02-16 22:12:27.637898439 +0000 UTC m=+2068.220399571" observedRunningTime="2026-02-16 22:12:28.279945022 +0000 UTC m=+2068.862446164" watchObservedRunningTime="2026-02-16 22:12:28.280393635 +0000 UTC m=+2068.862894737"
Feb 16 22:12:30 crc kubenswrapper[4777]: E0216 22:12:30.202254 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:12:34 crc kubenswrapper[4777]: I0216 22:12:34.127807 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:34 crc kubenswrapper[4777]: I0216 22:12:34.129052 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:35 crc kubenswrapper[4777]: I0216 22:12:35.197575 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2qbdn" podUID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerName="registry-server" probeResult="failure" output=<
Feb 16 22:12:35 crc kubenswrapper[4777]: timeout: failed to connect service ":50051" within 1s
Feb 16 22:12:35 crc kubenswrapper[4777]: >
Feb 16 22:12:41 crc kubenswrapper[4777]: I0216 22:12:41.652208 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:12:41 crc kubenswrapper[4777]: I0216 22:12:41.653043 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:12:41 crc kubenswrapper[4777]: I0216 22:12:41.653126 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj"
Feb 16 22:12:41 crc kubenswrapper[4777]: I0216 22:12:41.654234 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62eb03a55fe17d4c32ab25ce27d34307e3c49a4f99a2d2f05c0cefe5f385fa4e"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 16 22:12:41 crc kubenswrapper[4777]: I0216 22:12:41.654318 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://62eb03a55fe17d4c32ab25ce27d34307e3c49a4f99a2d2f05c0cefe5f385fa4e" gracePeriod=600
Feb 16 22:12:42 crc kubenswrapper[4777]: I0216 22:12:42.413737 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="62eb03a55fe17d4c32ab25ce27d34307e3c49a4f99a2d2f05c0cefe5f385fa4e" exitCode=0
Feb 16 22:12:42 crc kubenswrapper[4777]: I0216 22:12:42.413983 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"62eb03a55fe17d4c32ab25ce27d34307e3c49a4f99a2d2f05c0cefe5f385fa4e"}
Feb 16 22:12:42 crc kubenswrapper[4777]: I0216 22:12:42.414353 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0"}
Feb 16 22:12:42 crc kubenswrapper[4777]: I0216 22:12:42.414381 4777 scope.go:117] "RemoveContainer" containerID="a0c75dcb95fa35097d9c1c7b66e32f37157f6f5bea62801900b95618dfcf1bf2"
Feb 16 22:12:44 crc kubenswrapper[4777]: E0216 22:12:44.184997 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:12:44 crc kubenswrapper[4777]: I0216 22:12:44.206637 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2qbdn"
Feb 16 22:12:44 crc kubenswrapper[4777]: I0216 22:12:44.274570 4777 kubelet.go:2542] "SyncLoop (probe)"
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2qbdn" Feb 16 22:12:47 crc kubenswrapper[4777]: I0216 22:12:47.580868 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qbdn"] Feb 16 22:12:47 crc kubenswrapper[4777]: I0216 22:12:47.581776 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2qbdn" podUID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerName="registry-server" containerID="cri-o://e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520" gracePeriod=2 Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.192710 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qbdn" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.309636 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-catalog-content\") pod \"014399af-6c75-41f6-acd9-f7f43d84dffd\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.309768 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm92x\" (UniqueName: \"kubernetes.io/projected/014399af-6c75-41f6-acd9-f7f43d84dffd-kube-api-access-mm92x\") pod \"014399af-6c75-41f6-acd9-f7f43d84dffd\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.310042 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-utilities\") pod \"014399af-6c75-41f6-acd9-f7f43d84dffd\" (UID: \"014399af-6c75-41f6-acd9-f7f43d84dffd\") " Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.311309 4777 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-utilities" (OuterVolumeSpecName: "utilities") pod "014399af-6c75-41f6-acd9-f7f43d84dffd" (UID: "014399af-6c75-41f6-acd9-f7f43d84dffd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.318501 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014399af-6c75-41f6-acd9-f7f43d84dffd-kube-api-access-mm92x" (OuterVolumeSpecName: "kube-api-access-mm92x") pod "014399af-6c75-41f6-acd9-f7f43d84dffd" (UID: "014399af-6c75-41f6-acd9-f7f43d84dffd"). InnerVolumeSpecName "kube-api-access-mm92x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.411969 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.412001 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm92x\" (UniqueName: \"kubernetes.io/projected/014399af-6c75-41f6-acd9-f7f43d84dffd-kube-api-access-mm92x\") on node \"crc\" DevicePath \"\"" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.439235 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "014399af-6c75-41f6-acd9-f7f43d84dffd" (UID: "014399af-6c75-41f6-acd9-f7f43d84dffd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.483062 4777 generic.go:334] "Generic (PLEG): container finished" podID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerID="e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520" exitCode=0 Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.483106 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qbdn" event={"ID":"014399af-6c75-41f6-acd9-f7f43d84dffd","Type":"ContainerDied","Data":"e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520"} Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.483117 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qbdn" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.483135 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qbdn" event={"ID":"014399af-6c75-41f6-acd9-f7f43d84dffd","Type":"ContainerDied","Data":"d2466c57e0ca07afe56565bb9661159d3099db1e24ee7ce484a6867a4e3d29c2"} Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.483154 4777 scope.go:117] "RemoveContainer" containerID="e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.513736 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/014399af-6c75-41f6-acd9-f7f43d84dffd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.514703 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qbdn"] Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.518672 4777 scope.go:117] "RemoveContainer" containerID="c9f4b7e95f36ed6de648e9beb5cc4688c6d5e83165699d52f900cb50cd7b0a7b" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 
22:12:48.522891 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2qbdn"] Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.540892 4777 scope.go:117] "RemoveContainer" containerID="3f80bc93c1372942fc3355ab88eb049b0dcdfdde5e7038604b72173ae6d6ae32" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.603115 4777 scope.go:117] "RemoveContainer" containerID="e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520" Feb 16 22:12:48 crc kubenswrapper[4777]: E0216 22:12:48.607989 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520\": container with ID starting with e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520 not found: ID does not exist" containerID="e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.608031 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520"} err="failed to get container status \"e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520\": rpc error: code = NotFound desc = could not find container \"e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520\": container with ID starting with e17060934af9f426b2b927a55a003ec72ee9537422aa74cd5f0ee6c7caa46520 not found: ID does not exist" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.608058 4777 scope.go:117] "RemoveContainer" containerID="c9f4b7e95f36ed6de648e9beb5cc4688c6d5e83165699d52f900cb50cd7b0a7b" Feb 16 22:12:48 crc kubenswrapper[4777]: E0216 22:12:48.608539 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f4b7e95f36ed6de648e9beb5cc4688c6d5e83165699d52f900cb50cd7b0a7b\": container with ID 
starting with c9f4b7e95f36ed6de648e9beb5cc4688c6d5e83165699d52f900cb50cd7b0a7b not found: ID does not exist" containerID="c9f4b7e95f36ed6de648e9beb5cc4688c6d5e83165699d52f900cb50cd7b0a7b" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.608574 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f4b7e95f36ed6de648e9beb5cc4688c6d5e83165699d52f900cb50cd7b0a7b"} err="failed to get container status \"c9f4b7e95f36ed6de648e9beb5cc4688c6d5e83165699d52f900cb50cd7b0a7b\": rpc error: code = NotFound desc = could not find container \"c9f4b7e95f36ed6de648e9beb5cc4688c6d5e83165699d52f900cb50cd7b0a7b\": container with ID starting with c9f4b7e95f36ed6de648e9beb5cc4688c6d5e83165699d52f900cb50cd7b0a7b not found: ID does not exist" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.608597 4777 scope.go:117] "RemoveContainer" containerID="3f80bc93c1372942fc3355ab88eb049b0dcdfdde5e7038604b72173ae6d6ae32" Feb 16 22:12:48 crc kubenswrapper[4777]: E0216 22:12:48.608994 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f80bc93c1372942fc3355ab88eb049b0dcdfdde5e7038604b72173ae6d6ae32\": container with ID starting with 3f80bc93c1372942fc3355ab88eb049b0dcdfdde5e7038604b72173ae6d6ae32 not found: ID does not exist" containerID="3f80bc93c1372942fc3355ab88eb049b0dcdfdde5e7038604b72173ae6d6ae32" Feb 16 22:12:48 crc kubenswrapper[4777]: I0216 22:12:48.609042 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f80bc93c1372942fc3355ab88eb049b0dcdfdde5e7038604b72173ae6d6ae32"} err="failed to get container status \"3f80bc93c1372942fc3355ab88eb049b0dcdfdde5e7038604b72173ae6d6ae32\": rpc error: code = NotFound desc = could not find container \"3f80bc93c1372942fc3355ab88eb049b0dcdfdde5e7038604b72173ae6d6ae32\": container with ID starting with 3f80bc93c1372942fc3355ab88eb049b0dcdfdde5e7038604b72173ae6d6ae32 not found: 
ID does not exist" Feb 16 22:12:50 crc kubenswrapper[4777]: I0216 22:12:50.191130 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="014399af-6c75-41f6-acd9-f7f43d84dffd" path="/var/lib/kubelet/pods/014399af-6c75-41f6-acd9-f7f43d84dffd/volumes" Feb 16 22:12:55 crc kubenswrapper[4777]: E0216 22:12:55.184838 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:13:08 crc kubenswrapper[4777]: E0216 22:13:08.186454 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:13:20 crc kubenswrapper[4777]: E0216 22:13:20.191181 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:13:35 crc kubenswrapper[4777]: E0216 22:13:35.184190 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:13:46 crc kubenswrapper[4777]: E0216 22:13:46.184757 4777 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:13:58 crc kubenswrapper[4777]: E0216 22:13:58.184540 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:14:10 crc kubenswrapper[4777]: E0216 22:14:10.193638 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:14:22 crc kubenswrapper[4777]: E0216 22:14:22.305057 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:14:22 crc kubenswrapper[4777]: E0216 22:14:22.305555 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:14:22 crc kubenswrapper[4777]: E0216 22:14:22.305690 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/
var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError" Feb 16 22:14:22 crc kubenswrapper[4777]: E0216 22:14:22.306887 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:14:38 crc kubenswrapper[4777]: E0216 22:14:38.184265 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:14:52 crc kubenswrapper[4777]: E0216 22:14:52.184330 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.169984 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58"] Feb 16 22:15:00 crc kubenswrapper[4777]: E0216 22:15:00.171442 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerName="extract-content" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.171471 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerName="extract-content" Feb 16 22:15:00 crc kubenswrapper[4777]: E0216 22:15:00.171509 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerName="extract-utilities" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.171528 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerName="extract-utilities" Feb 16 22:15:00 crc kubenswrapper[4777]: E0216 22:15:00.171560 4777 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerName="registry-server" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.171578 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerName="registry-server" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.172034 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="014399af-6c75-41f6-acd9-f7f43d84dffd" containerName="registry-server" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.173466 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.177400 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.177930 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.217943 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58"] Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.334528 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj785\" (UniqueName: \"kubernetes.io/projected/7e52b67e-7aae-4859-bdda-13aa6e4859a3-kube-api-access-bj785\") pod \"collect-profiles-29521335-c4m58\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.334954 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7e52b67e-7aae-4859-bdda-13aa6e4859a3-secret-volume\") pod \"collect-profiles-29521335-c4m58\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.335033 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e52b67e-7aae-4859-bdda-13aa6e4859a3-config-volume\") pod \"collect-profiles-29521335-c4m58\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.437729 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj785\" (UniqueName: \"kubernetes.io/projected/7e52b67e-7aae-4859-bdda-13aa6e4859a3-kube-api-access-bj785\") pod \"collect-profiles-29521335-c4m58\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.437856 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e52b67e-7aae-4859-bdda-13aa6e4859a3-secret-volume\") pod \"collect-profiles-29521335-c4m58\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.437878 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e52b67e-7aae-4859-bdda-13aa6e4859a3-config-volume\") pod \"collect-profiles-29521335-c4m58\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:00 crc kubenswrapper[4777]: 
I0216 22:15:00.439043 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e52b67e-7aae-4859-bdda-13aa6e4859a3-config-volume\") pod \"collect-profiles-29521335-c4m58\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.454993 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e52b67e-7aae-4859-bdda-13aa6e4859a3-secret-volume\") pod \"collect-profiles-29521335-c4m58\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.469942 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj785\" (UniqueName: \"kubernetes.io/projected/7e52b67e-7aae-4859-bdda-13aa6e4859a3-kube-api-access-bj785\") pod \"collect-profiles-29521335-c4m58\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:00 crc kubenswrapper[4777]: I0216 22:15:00.510790 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:01 crc kubenswrapper[4777]: I0216 22:15:01.020648 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58"] Feb 16 22:15:01 crc kubenswrapper[4777]: I0216 22:15:01.809773 4777 generic.go:334] "Generic (PLEG): container finished" podID="7e52b67e-7aae-4859-bdda-13aa6e4859a3" containerID="47b2e5721bd39c2360bfb9d5e676649719a7d74aea2c83c728248dce96105801" exitCode=0 Feb 16 22:15:01 crc kubenswrapper[4777]: I0216 22:15:01.811417 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" event={"ID":"7e52b67e-7aae-4859-bdda-13aa6e4859a3","Type":"ContainerDied","Data":"47b2e5721bd39c2360bfb9d5e676649719a7d74aea2c83c728248dce96105801"} Feb 16 22:15:01 crc kubenswrapper[4777]: I0216 22:15:01.811969 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" event={"ID":"7e52b67e-7aae-4859-bdda-13aa6e4859a3","Type":"ContainerStarted","Data":"e07b6b4b08bff6df799dd694a55d2a078dc35b1e7a950cff55c4cb920d8ff3db"} Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.259029 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.412660 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj785\" (UniqueName: \"kubernetes.io/projected/7e52b67e-7aae-4859-bdda-13aa6e4859a3-kube-api-access-bj785\") pod \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.413021 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e52b67e-7aae-4859-bdda-13aa6e4859a3-config-volume\") pod \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.413053 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e52b67e-7aae-4859-bdda-13aa6e4859a3-secret-volume\") pod \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\" (UID: \"7e52b67e-7aae-4859-bdda-13aa6e4859a3\") " Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.413784 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e52b67e-7aae-4859-bdda-13aa6e4859a3-config-volume" (OuterVolumeSpecName: "config-volume") pod "7e52b67e-7aae-4859-bdda-13aa6e4859a3" (UID: "7e52b67e-7aae-4859-bdda-13aa6e4859a3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.421475 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e52b67e-7aae-4859-bdda-13aa6e4859a3-kube-api-access-bj785" (OuterVolumeSpecName: "kube-api-access-bj785") pod "7e52b67e-7aae-4859-bdda-13aa6e4859a3" (UID: "7e52b67e-7aae-4859-bdda-13aa6e4859a3"). 
InnerVolumeSpecName "kube-api-access-bj785". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.424573 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e52b67e-7aae-4859-bdda-13aa6e4859a3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7e52b67e-7aae-4859-bdda-13aa6e4859a3" (UID: "7e52b67e-7aae-4859-bdda-13aa6e4859a3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.515264 4777 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e52b67e-7aae-4859-bdda-13aa6e4859a3-config-volume\") on node \"crc\" DevicePath \"\"" Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.515305 4777 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e52b67e-7aae-4859-bdda-13aa6e4859a3-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.515316 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj785\" (UniqueName: \"kubernetes.io/projected/7e52b67e-7aae-4859-bdda-13aa6e4859a3-kube-api-access-bj785\") on node \"crc\" DevicePath \"\"" Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.838147 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" event={"ID":"7e52b67e-7aae-4859-bdda-13aa6e4859a3","Type":"ContainerDied","Data":"e07b6b4b08bff6df799dd694a55d2a078dc35b1e7a950cff55c4cb920d8ff3db"} Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.838206 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e07b6b4b08bff6df799dd694a55d2a078dc35b1e7a950cff55c4cb920d8ff3db" Feb 16 22:15:03 crc kubenswrapper[4777]: I0216 22:15:03.838301 4777 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521335-c4m58" Feb 16 22:15:04 crc kubenswrapper[4777]: I0216 22:15:04.378141 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml"] Feb 16 22:15:04 crc kubenswrapper[4777]: I0216 22:15:04.389123 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521290-h7lml"] Feb 16 22:15:06 crc kubenswrapper[4777]: I0216 22:15:06.204942 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eadd010-ae18-4453-8810-1f8edf434cb3" path="/var/lib/kubelet/pods/4eadd010-ae18-4453-8810-1f8edf434cb3/volumes" Feb 16 22:15:07 crc kubenswrapper[4777]: E0216 22:15:07.183989 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:15:11 crc kubenswrapper[4777]: I0216 22:15:11.652099 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 22:15:11 crc kubenswrapper[4777]: I0216 22:15:11.652787 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 22:15:18 crc kubenswrapper[4777]: E0216 22:15:18.184195 4777 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:15:33 crc kubenswrapper[4777]: E0216 22:15:33.184434 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:15:41 crc kubenswrapper[4777]: I0216 22:15:41.651409 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 22:15:41 crc kubenswrapper[4777]: I0216 22:15:41.652085 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 22:15:44 crc kubenswrapper[4777]: E0216 22:15:44.186818 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:15:49 crc kubenswrapper[4777]: I0216 22:15:49.780013 4777 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-d5z7k"] Feb 16 22:15:49 crc kubenswrapper[4777]: E0216 22:15:49.781577 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e52b67e-7aae-4859-bdda-13aa6e4859a3" containerName="collect-profiles" Feb 16 22:15:49 crc kubenswrapper[4777]: I0216 22:15:49.781604 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e52b67e-7aae-4859-bdda-13aa6e4859a3" containerName="collect-profiles" Feb 16 22:15:49 crc kubenswrapper[4777]: I0216 22:15:49.782089 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e52b67e-7aae-4859-bdda-13aa6e4859a3" containerName="collect-profiles" Feb 16 22:15:49 crc kubenswrapper[4777]: I0216 22:15:49.785446 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:15:49 crc kubenswrapper[4777]: I0216 22:15:49.797185 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5z7k"] Feb 16 22:15:49 crc kubenswrapper[4777]: I0216 22:15:49.937823 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmndm\" (UniqueName: \"kubernetes.io/projected/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-kube-api-access-gmndm\") pod \"redhat-marketplace-d5z7k\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:15:49 crc kubenswrapper[4777]: I0216 22:15:49.937891 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-utilities\") pod \"redhat-marketplace-d5z7k\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:15:49 crc kubenswrapper[4777]: I0216 22:15:49.938189 4777 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-catalog-content\") pod \"redhat-marketplace-d5z7k\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:15:50 crc kubenswrapper[4777]: I0216 22:15:50.040480 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmndm\" (UniqueName: \"kubernetes.io/projected/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-kube-api-access-gmndm\") pod \"redhat-marketplace-d5z7k\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:15:50 crc kubenswrapper[4777]: I0216 22:15:50.040568 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-utilities\") pod \"redhat-marketplace-d5z7k\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:15:50 crc kubenswrapper[4777]: I0216 22:15:50.040750 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-catalog-content\") pod \"redhat-marketplace-d5z7k\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:15:50 crc kubenswrapper[4777]: I0216 22:15:50.041399 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-utilities\") pod \"redhat-marketplace-d5z7k\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:15:50 crc kubenswrapper[4777]: I0216 22:15:50.041502 4777 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-catalog-content\") pod \"redhat-marketplace-d5z7k\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:15:50 crc kubenswrapper[4777]: I0216 22:15:50.070806 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmndm\" (UniqueName: \"kubernetes.io/projected/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-kube-api-access-gmndm\") pod \"redhat-marketplace-d5z7k\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:15:50 crc kubenswrapper[4777]: I0216 22:15:50.133131 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:15:50 crc kubenswrapper[4777]: I0216 22:15:50.615224 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5z7k"] Feb 16 22:15:51 crc kubenswrapper[4777]: I0216 22:15:51.094647 4777 generic.go:334] "Generic (PLEG): container finished" podID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" containerID="8748c02c55dc191c6f5d587b9cc35d43344d21e6f24eddd1bfb47e1593a3b7bc" exitCode=0 Feb 16 22:15:51 crc kubenswrapper[4777]: I0216 22:15:51.094849 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5z7k" event={"ID":"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3","Type":"ContainerDied","Data":"8748c02c55dc191c6f5d587b9cc35d43344d21e6f24eddd1bfb47e1593a3b7bc"} Feb 16 22:15:51 crc kubenswrapper[4777]: I0216 22:15:51.095110 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5z7k" event={"ID":"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3","Type":"ContainerStarted","Data":"983ad14ee32cd3c30266a21e94255cb3b564642b57b7aa769951c72bf099d82c"} Feb 16 22:15:51 crc kubenswrapper[4777]: I0216 22:15:51.097656 
4777 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.163125 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2w8j7"] Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.167610 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.211378 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2w8j7"] Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.289770 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkv85\" (UniqueName: \"kubernetes.io/projected/d5052a8d-c789-47a9-9134-746dc7255c59-kube-api-access-zkv85\") pod \"certified-operators-2w8j7\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.290061 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-catalog-content\") pod \"certified-operators-2w8j7\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.290127 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-utilities\") pod \"certified-operators-2w8j7\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.393061 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zkv85\" (UniqueName: \"kubernetes.io/projected/d5052a8d-c789-47a9-9134-746dc7255c59-kube-api-access-zkv85\") pod \"certified-operators-2w8j7\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.393116 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-catalog-content\") pod \"certified-operators-2w8j7\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.393160 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-utilities\") pod \"certified-operators-2w8j7\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.393835 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-utilities\") pod \"certified-operators-2w8j7\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.393928 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-catalog-content\") pod \"certified-operators-2w8j7\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.420385 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zkv85\" (UniqueName: \"kubernetes.io/projected/d5052a8d-c789-47a9-9134-746dc7255c59-kube-api-access-zkv85\") pod \"certified-operators-2w8j7\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.493202 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:15:52 crc kubenswrapper[4777]: I0216 22:15:52.989728 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2w8j7"] Feb 16 22:15:52 crc kubenswrapper[4777]: W0216 22:15:52.993977 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5052a8d_c789_47a9_9134_746dc7255c59.slice/crio-570339591c2576a29b7d4e96df42513328c1a2d69619947ed13a722976401d7e WatchSource:0}: Error finding container 570339591c2576a29b7d4e96df42513328c1a2d69619947ed13a722976401d7e: Status 404 returned error can't find the container with id 570339591c2576a29b7d4e96df42513328c1a2d69619947ed13a722976401d7e Feb 16 22:15:53 crc kubenswrapper[4777]: I0216 22:15:53.118848 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2w8j7" event={"ID":"d5052a8d-c789-47a9-9134-746dc7255c59","Type":"ContainerStarted","Data":"570339591c2576a29b7d4e96df42513328c1a2d69619947ed13a722976401d7e"} Feb 16 22:15:53 crc kubenswrapper[4777]: I0216 22:15:53.121251 4777 generic.go:334] "Generic (PLEG): container finished" podID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" containerID="bfb71840c7fd4e87811beb97bc8efb5029abfbad65c3c674070be056e810bae2" exitCode=0 Feb 16 22:15:53 crc kubenswrapper[4777]: I0216 22:15:53.121288 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5z7k" 
event={"ID":"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3","Type":"ContainerDied","Data":"bfb71840c7fd4e87811beb97bc8efb5029abfbad65c3c674070be056e810bae2"} Feb 16 22:15:54 crc kubenswrapper[4777]: I0216 22:15:54.137117 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5z7k" event={"ID":"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3","Type":"ContainerStarted","Data":"a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6"} Feb 16 22:15:54 crc kubenswrapper[4777]: I0216 22:15:54.142912 4777 generic.go:334] "Generic (PLEG): container finished" podID="d5052a8d-c789-47a9-9134-746dc7255c59" containerID="c93cdc026336ecdac887abf12a237f84a34afefbe1f3cf1df0dd50e52632ae7c" exitCode=0 Feb 16 22:15:54 crc kubenswrapper[4777]: I0216 22:15:54.142968 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2w8j7" event={"ID":"d5052a8d-c789-47a9-9134-746dc7255c59","Type":"ContainerDied","Data":"c93cdc026336ecdac887abf12a237f84a34afefbe1f3cf1df0dd50e52632ae7c"} Feb 16 22:15:54 crc kubenswrapper[4777]: I0216 22:15:54.183552 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d5z7k" podStartSLOduration=2.759847088 podStartE2EDuration="5.183522056s" podCreationTimestamp="2026-02-16 22:15:49 +0000 UTC" firstStartedPulling="2026-02-16 22:15:51.09745837 +0000 UTC m=+2271.679959472" lastFinishedPulling="2026-02-16 22:15:53.521133298 +0000 UTC m=+2274.103634440" observedRunningTime="2026-02-16 22:15:54.170915699 +0000 UTC m=+2274.753416821" watchObservedRunningTime="2026-02-16 22:15:54.183522056 +0000 UTC m=+2274.766023198" Feb 16 22:15:55 crc kubenswrapper[4777]: I0216 22:15:55.152991 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2w8j7" 
event={"ID":"d5052a8d-c789-47a9-9134-746dc7255c59","Type":"ContainerStarted","Data":"a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411"} Feb 16 22:15:56 crc kubenswrapper[4777]: E0216 22:15:56.190903 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:15:56 crc kubenswrapper[4777]: I0216 22:15:56.193267 4777 generic.go:334] "Generic (PLEG): container finished" podID="d5052a8d-c789-47a9-9134-746dc7255c59" containerID="a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411" exitCode=0 Feb 16 22:15:56 crc kubenswrapper[4777]: I0216 22:15:56.205144 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2w8j7" event={"ID":"d5052a8d-c789-47a9-9134-746dc7255c59","Type":"ContainerDied","Data":"a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411"} Feb 16 22:15:57 crc kubenswrapper[4777]: I0216 22:15:57.211763 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2w8j7" event={"ID":"d5052a8d-c789-47a9-9134-746dc7255c59","Type":"ContainerStarted","Data":"986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a"} Feb 16 22:15:57 crc kubenswrapper[4777]: I0216 22:15:57.245001 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2w8j7" podStartSLOduration=2.815134783 podStartE2EDuration="5.244976856s" podCreationTimestamp="2026-02-16 22:15:52 +0000 UTC" firstStartedPulling="2026-02-16 22:15:54.147027412 +0000 UTC m=+2274.729528554" lastFinishedPulling="2026-02-16 22:15:56.576869495 +0000 UTC m=+2277.159370627" observedRunningTime="2026-02-16 22:15:57.237268948 +0000 UTC 
m=+2277.819770080" watchObservedRunningTime="2026-02-16 22:15:57.244976856 +0000 UTC m=+2277.827477988" Feb 16 22:16:00 crc kubenswrapper[4777]: I0216 22:16:00.133634 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:16:00 crc kubenswrapper[4777]: I0216 22:16:00.134468 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:16:00 crc kubenswrapper[4777]: I0216 22:16:00.212084 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:16:00 crc kubenswrapper[4777]: I0216 22:16:00.324487 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:16:01 crc kubenswrapper[4777]: I0216 22:16:01.560710 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5z7k"] Feb 16 22:16:02 crc kubenswrapper[4777]: I0216 22:16:02.280361 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d5z7k" podUID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" containerName="registry-server" containerID="cri-o://a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6" gracePeriod=2 Feb 16 22:16:02 crc kubenswrapper[4777]: I0216 22:16:02.494293 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:16:02 crc kubenswrapper[4777]: I0216 22:16:02.494358 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:16:02 crc kubenswrapper[4777]: I0216 22:16:02.570068 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:16:02 crc 
kubenswrapper[4777]: I0216 22:16:02.884514 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.003497 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-catalog-content\") pod \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.003555 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmndm\" (UniqueName: \"kubernetes.io/projected/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-kube-api-access-gmndm\") pod \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.003625 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-utilities\") pod \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\" (UID: \"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3\") " Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.004672 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-utilities" (OuterVolumeSpecName: "utilities") pod "a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" (UID: "a1c0bb0e-02a6-4ee2-8600-b394410ae4c3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.014001 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-kube-api-access-gmndm" (OuterVolumeSpecName: "kube-api-access-gmndm") pod "a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" (UID: "a1c0bb0e-02a6-4ee2-8600-b394410ae4c3"). InnerVolumeSpecName "kube-api-access-gmndm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.025107 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" (UID: "a1c0bb0e-02a6-4ee2-8600-b394410ae4c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.107240 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmndm\" (UniqueName: \"kubernetes.io/projected/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-kube-api-access-gmndm\") on node \"crc\" DevicePath \"\"" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.107306 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.107329 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.307584 4777 generic.go:334] "Generic (PLEG): container finished" podID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" 
containerID="a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6" exitCode=0 Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.309051 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5z7k" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.309987 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5z7k" event={"ID":"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3","Type":"ContainerDied","Data":"a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6"} Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.310053 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5z7k" event={"ID":"a1c0bb0e-02a6-4ee2-8600-b394410ae4c3","Type":"ContainerDied","Data":"983ad14ee32cd3c30266a21e94255cb3b564642b57b7aa769951c72bf099d82c"} Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.310090 4777 scope.go:117] "RemoveContainer" containerID="a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.375419 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5z7k"] Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.384570 4777 scope.go:117] "RemoveContainer" containerID="bfb71840c7fd4e87811beb97bc8efb5029abfbad65c3c674070be056e810bae2" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.386097 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5z7k"] Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.415794 4777 scope.go:117] "RemoveContainer" containerID="8748c02c55dc191c6f5d587b9cc35d43344d21e6f24eddd1bfb47e1593a3b7bc" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.430846 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2w8j7" 
Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.484645 4777 scope.go:117] "RemoveContainer" containerID="a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6" Feb 16 22:16:03 crc kubenswrapper[4777]: E0216 22:16:03.485247 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6\": container with ID starting with a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6 not found: ID does not exist" containerID="a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.485298 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6"} err="failed to get container status \"a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6\": rpc error: code = NotFound desc = could not find container \"a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6\": container with ID starting with a9465d1beae4210fce6de785baf7ec395127f25625b608ccef39e62859498af6 not found: ID does not exist" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.485333 4777 scope.go:117] "RemoveContainer" containerID="bfb71840c7fd4e87811beb97bc8efb5029abfbad65c3c674070be056e810bae2" Feb 16 22:16:03 crc kubenswrapper[4777]: E0216 22:16:03.485811 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfb71840c7fd4e87811beb97bc8efb5029abfbad65c3c674070be056e810bae2\": container with ID starting with bfb71840c7fd4e87811beb97bc8efb5029abfbad65c3c674070be056e810bae2 not found: ID does not exist" containerID="bfb71840c7fd4e87811beb97bc8efb5029abfbad65c3c674070be056e810bae2" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.485867 4777 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"bfb71840c7fd4e87811beb97bc8efb5029abfbad65c3c674070be056e810bae2"} err="failed to get container status \"bfb71840c7fd4e87811beb97bc8efb5029abfbad65c3c674070be056e810bae2\": rpc error: code = NotFound desc = could not find container \"bfb71840c7fd4e87811beb97bc8efb5029abfbad65c3c674070be056e810bae2\": container with ID starting with bfb71840c7fd4e87811beb97bc8efb5029abfbad65c3c674070be056e810bae2 not found: ID does not exist" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.485904 4777 scope.go:117] "RemoveContainer" containerID="8748c02c55dc191c6f5d587b9cc35d43344d21e6f24eddd1bfb47e1593a3b7bc" Feb 16 22:16:03 crc kubenswrapper[4777]: E0216 22:16:03.486287 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8748c02c55dc191c6f5d587b9cc35d43344d21e6f24eddd1bfb47e1593a3b7bc\": container with ID starting with 8748c02c55dc191c6f5d587b9cc35d43344d21e6f24eddd1bfb47e1593a3b7bc not found: ID does not exist" containerID="8748c02c55dc191c6f5d587b9cc35d43344d21e6f24eddd1bfb47e1593a3b7bc" Feb 16 22:16:03 crc kubenswrapper[4777]: I0216 22:16:03.486348 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8748c02c55dc191c6f5d587b9cc35d43344d21e6f24eddd1bfb47e1593a3b7bc"} err="failed to get container status \"8748c02c55dc191c6f5d587b9cc35d43344d21e6f24eddd1bfb47e1593a3b7bc\": rpc error: code = NotFound desc = could not find container \"8748c02c55dc191c6f5d587b9cc35d43344d21e6f24eddd1bfb47e1593a3b7bc\": container with ID starting with 8748c02c55dc191c6f5d587b9cc35d43344d21e6f24eddd1bfb47e1593a3b7bc not found: ID does not exist" Feb 16 22:16:04 crc kubenswrapper[4777]: I0216 22:16:04.212340 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" path="/var/lib/kubelet/pods/a1c0bb0e-02a6-4ee2-8600-b394410ae4c3/volumes" Feb 16 22:16:04 crc 
kubenswrapper[4777]: I0216 22:16:04.950213 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2w8j7"] Feb 16 22:16:04 crc kubenswrapper[4777]: I0216 22:16:04.981327 4777 scope.go:117] "RemoveContainer" containerID="b652e68a557e6bf036b7d7c10e707aaeac80c326803214742d609c3164d2a186" Feb 16 22:16:05 crc kubenswrapper[4777]: I0216 22:16:05.331792 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2w8j7" podUID="d5052a8d-c789-47a9-9134-746dc7255c59" containerName="registry-server" containerID="cri-o://986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a" gracePeriod=2 Feb 16 22:16:05 crc kubenswrapper[4777]: I0216 22:16:05.934549 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.052773 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-utilities\") pod \"d5052a8d-c789-47a9-9134-746dc7255c59\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.052856 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-catalog-content\") pod \"d5052a8d-c789-47a9-9134-746dc7255c59\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.052956 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkv85\" (UniqueName: \"kubernetes.io/projected/d5052a8d-c789-47a9-9134-746dc7255c59-kube-api-access-zkv85\") pod \"d5052a8d-c789-47a9-9134-746dc7255c59\" (UID: \"d5052a8d-c789-47a9-9134-746dc7255c59\") " Feb 16 22:16:06 crc 
kubenswrapper[4777]: I0216 22:16:06.055092 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-utilities" (OuterVolumeSpecName: "utilities") pod "d5052a8d-c789-47a9-9134-746dc7255c59" (UID: "d5052a8d-c789-47a9-9134-746dc7255c59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.060366 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5052a8d-c789-47a9-9134-746dc7255c59-kube-api-access-zkv85" (OuterVolumeSpecName: "kube-api-access-zkv85") pod "d5052a8d-c789-47a9-9134-746dc7255c59" (UID: "d5052a8d-c789-47a9-9134-746dc7255c59"). InnerVolumeSpecName "kube-api-access-zkv85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.155644 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.155678 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkv85\" (UniqueName: \"kubernetes.io/projected/d5052a8d-c789-47a9-9134-746dc7255c59-kube-api-access-zkv85\") on node \"crc\" DevicePath \"\"" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.346896 4777 generic.go:334] "Generic (PLEG): container finished" podID="d5052a8d-c789-47a9-9134-746dc7255c59" containerID="986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a" exitCode=0 Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.347004 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2w8j7" event={"ID":"d5052a8d-c789-47a9-9134-746dc7255c59","Type":"ContainerDied","Data":"986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a"} Feb 16 
22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.347044 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2w8j7" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.347073 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2w8j7" event={"ID":"d5052a8d-c789-47a9-9134-746dc7255c59","Type":"ContainerDied","Data":"570339591c2576a29b7d4e96df42513328c1a2d69619947ed13a722976401d7e"} Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.347132 4777 scope.go:117] "RemoveContainer" containerID="986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.403404 4777 scope.go:117] "RemoveContainer" containerID="a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.438557 4777 scope.go:117] "RemoveContainer" containerID="c93cdc026336ecdac887abf12a237f84a34afefbe1f3cf1df0dd50e52632ae7c" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.517159 4777 scope.go:117] "RemoveContainer" containerID="986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a" Feb 16 22:16:06 crc kubenswrapper[4777]: E0216 22:16:06.518272 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a\": container with ID starting with 986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a not found: ID does not exist" containerID="986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.518329 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a"} err="failed to get container status 
\"986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a\": rpc error: code = NotFound desc = could not find container \"986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a\": container with ID starting with 986b494e7a29e1b48d1dd2344f92be81cfafab87d54a122c1a090749659d0e5a not found: ID does not exist" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.518364 4777 scope.go:117] "RemoveContainer" containerID="a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411" Feb 16 22:16:06 crc kubenswrapper[4777]: E0216 22:16:06.518912 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411\": container with ID starting with a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411 not found: ID does not exist" containerID="a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.519080 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411"} err="failed to get container status \"a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411\": rpc error: code = NotFound desc = could not find container \"a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411\": container with ID starting with a6f3278e086d2ecbcab02c4e8894bd18b9225fbdab77e2fe025039c828b24411 not found: ID does not exist" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.519240 4777 scope.go:117] "RemoveContainer" containerID="c93cdc026336ecdac887abf12a237f84a34afefbe1f3cf1df0dd50e52632ae7c" Feb 16 22:16:06 crc kubenswrapper[4777]: E0216 22:16:06.519998 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c93cdc026336ecdac887abf12a237f84a34afefbe1f3cf1df0dd50e52632ae7c\": container with ID starting with c93cdc026336ecdac887abf12a237f84a34afefbe1f3cf1df0dd50e52632ae7c not found: ID does not exist" containerID="c93cdc026336ecdac887abf12a237f84a34afefbe1f3cf1df0dd50e52632ae7c" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.520038 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93cdc026336ecdac887abf12a237f84a34afefbe1f3cf1df0dd50e52632ae7c"} err="failed to get container status \"c93cdc026336ecdac887abf12a237f84a34afefbe1f3cf1df0dd50e52632ae7c\": rpc error: code = NotFound desc = could not find container \"c93cdc026336ecdac887abf12a237f84a34afefbe1f3cf1df0dd50e52632ae7c\": container with ID starting with c93cdc026336ecdac887abf12a237f84a34afefbe1f3cf1df0dd50e52632ae7c not found: ID does not exist" Feb 16 22:16:06 crc kubenswrapper[4777]: I0216 22:16:06.976180 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5052a8d-c789-47a9-9134-746dc7255c59" (UID: "d5052a8d-c789-47a9-9134-746dc7255c59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:16:07 crc kubenswrapper[4777]: I0216 22:16:07.077028 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5052a8d-c789-47a9-9134-746dc7255c59-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 22:16:07 crc kubenswrapper[4777]: I0216 22:16:07.308895 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2w8j7"] Feb 16 22:16:07 crc kubenswrapper[4777]: I0216 22:16:07.326317 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2w8j7"] Feb 16 22:16:08 crc kubenswrapper[4777]: E0216 22:16:08.184585 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:16:08 crc kubenswrapper[4777]: I0216 22:16:08.203239 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5052a8d-c789-47a9-9134-746dc7255c59" path="/var/lib/kubelet/pods/d5052a8d-c789-47a9-9134-746dc7255c59/volumes" Feb 16 22:16:11 crc kubenswrapper[4777]: I0216 22:16:11.651760 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 22:16:11 crc kubenswrapper[4777]: I0216 22:16:11.652253 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 22:16:11 crc kubenswrapper[4777]: I0216 22:16:11.652304 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 22:16:11 crc kubenswrapper[4777]: I0216 22:16:11.653062 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 16 22:16:11 crc kubenswrapper[4777]: I0216 22:16:11.653114 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" gracePeriod=600 Feb 16 22:16:11 crc kubenswrapper[4777]: E0216 22:16:11.793181 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:16:12 crc kubenswrapper[4777]: I0216 22:16:12.460892 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" exitCode=0 Feb 16 22:16:12 crc kubenswrapper[4777]: I0216 22:16:12.460980 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0"} Feb 16 22:16:12 crc kubenswrapper[4777]: I0216 22:16:12.461061 4777 scope.go:117] "RemoveContainer" containerID="62eb03a55fe17d4c32ab25ce27d34307e3c49a4f99a2d2f05c0cefe5f385fa4e" Feb 16 22:16:12 crc kubenswrapper[4777]: I0216 22:16:12.461743 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:16:12 crc kubenswrapper[4777]: E0216 22:16:12.462060 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:16:21 crc kubenswrapper[4777]: E0216 22:16:21.186340 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:16:24 crc kubenswrapper[4777]: I0216 22:16:24.183244 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:16:24 crc kubenswrapper[4777]: E0216 22:16:24.184190 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:16:32 crc kubenswrapper[4777]: E0216 22:16:32.184920 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:16:36 crc kubenswrapper[4777]: I0216 22:16:36.182891 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:16:36 crc kubenswrapper[4777]: E0216 22:16:36.183999 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:16:44 crc kubenswrapper[4777]: E0216 22:16:44.184787 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:16:51 crc kubenswrapper[4777]: I0216 22:16:51.181848 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:16:51 crc kubenswrapper[4777]: E0216 22:16:51.182813 4777 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:16:58 crc kubenswrapper[4777]: E0216 22:16:58.186702 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:17:03 crc kubenswrapper[4777]: I0216 22:17:03.182461 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:17:03 crc kubenswrapper[4777]: E0216 22:17:03.183831 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:17:10 crc kubenswrapper[4777]: E0216 22:17:10.208020 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:17:15 crc kubenswrapper[4777]: I0216 22:17:15.182378 4777 scope.go:117] "RemoveContainer" 
containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:17:15 crc kubenswrapper[4777]: E0216 22:17:15.184168 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:17:23 crc kubenswrapper[4777]: E0216 22:17:23.185486 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:17:26 crc kubenswrapper[4777]: I0216 22:17:26.193330 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:17:26 crc kubenswrapper[4777]: E0216 22:17:26.194231 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:17:38 crc kubenswrapper[4777]: E0216 22:17:38.185080 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" 
pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:17:39 crc kubenswrapper[4777]: I0216 22:17:39.182547 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:17:39 crc kubenswrapper[4777]: E0216 22:17:39.183257 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:17:52 crc kubenswrapper[4777]: I0216 22:17:52.181891 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:17:52 crc kubenswrapper[4777]: E0216 22:17:52.182513 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:17:53 crc kubenswrapper[4777]: E0216 22:17:53.185013 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:18:07 crc kubenswrapper[4777]: I0216 22:18:07.182250 4777 scope.go:117] "RemoveContainer" 
containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:18:07 crc kubenswrapper[4777]: E0216 22:18:07.183207 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:18:08 crc kubenswrapper[4777]: E0216 22:18:08.183733 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:18:22 crc kubenswrapper[4777]: I0216 22:18:22.182609 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:18:22 crc kubenswrapper[4777]: E0216 22:18:22.183734 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:18:23 crc kubenswrapper[4777]: E0216 22:18:23.184555 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" 
pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:18:33 crc kubenswrapper[4777]: I0216 22:18:33.182130 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:18:33 crc kubenswrapper[4777]: E0216 22:18:33.183161 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:18:35 crc kubenswrapper[4777]: E0216 22:18:35.183230 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:18:48 crc kubenswrapper[4777]: I0216 22:18:48.182339 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:18:48 crc kubenswrapper[4777]: E0216 22:18:48.184071 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:18:48 crc kubenswrapper[4777]: E0216 22:18:48.186488 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:19:01 crc kubenswrapper[4777]: I0216 22:19:01.182499 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:19:01 crc kubenswrapper[4777]: E0216 22:19:01.183583 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:19:02 crc kubenswrapper[4777]: E0216 22:19:02.186172 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:19:16 crc kubenswrapper[4777]: I0216 22:19:16.183313 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:19:16 crc kubenswrapper[4777]: E0216 22:19:16.184776 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 
16 22:19:16 crc kubenswrapper[4777]: E0216 22:19:16.185768 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:19:29 crc kubenswrapper[4777]: E0216 22:19:29.327176 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:19:29 crc kubenswrapper[4777]: E0216 22:19:29.327986 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:19:29 crc kubenswrapper[4777]: E0216 22:19:29.328213 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessPr
obe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError" Feb 16 22:19:29 crc kubenswrapper[4777]: E0216 22:19:29.329483 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:19:30 crc kubenswrapper[4777]: I0216 22:19:30.193339 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:19:30 crc kubenswrapper[4777]: E0216 22:19:30.193858 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:19:41 crc kubenswrapper[4777]: I0216 22:19:41.182987 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:19:41 crc kubenswrapper[4777]: E0216 22:19:41.184393 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:19:42 crc kubenswrapper[4777]: E0216 22:19:42.187305 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:19:55 crc kubenswrapper[4777]: I0216 22:19:55.181798 4777 scope.go:117] "RemoveContainer" 
containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:19:55 crc kubenswrapper[4777]: E0216 22:19:55.182814 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:19:56 crc kubenswrapper[4777]: E0216 22:19:56.189705 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:20:07 crc kubenswrapper[4777]: I0216 22:20:07.183323 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:20:07 crc kubenswrapper[4777]: E0216 22:20:07.184842 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:20:10 crc kubenswrapper[4777]: E0216 22:20:10.204983 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" 
pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:20:22 crc kubenswrapper[4777]: I0216 22:20:22.182912 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:20:22 crc kubenswrapper[4777]: E0216 22:20:22.184281 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:20:23 crc kubenswrapper[4777]: E0216 22:20:23.510843 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:20:35 crc kubenswrapper[4777]: I0216 22:20:35.182284 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:20:35 crc kubenswrapper[4777]: E0216 22:20:35.183011 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:20:39 crc kubenswrapper[4777]: E0216 22:20:39.184594 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:20:46 crc kubenswrapper[4777]: I0216 22:20:46.182558 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:20:46 crc kubenswrapper[4777]: E0216 22:20:46.183633 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:20:53 crc kubenswrapper[4777]: E0216 22:20:53.185445 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:20:58 crc kubenswrapper[4777]: I0216 22:20:58.181932 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:20:58 crc kubenswrapper[4777]: E0216 22:20:58.182614 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 
16 22:21:05 crc kubenswrapper[4777]: E0216 22:21:05.187350 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.182493 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.952817 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dqhmz"] Feb 16 22:21:12 crc kubenswrapper[4777]: E0216 22:21:12.953592 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" containerName="extract-content" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.953608 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" containerName="extract-content" Feb 16 22:21:12 crc kubenswrapper[4777]: E0216 22:21:12.953631 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" containerName="registry-server" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.953639 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" containerName="registry-server" Feb 16 22:21:12 crc kubenswrapper[4777]: E0216 22:21:12.953671 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5052a8d-c789-47a9-9134-746dc7255c59" containerName="extract-utilities" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.953679 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5052a8d-c789-47a9-9134-746dc7255c59" containerName="extract-utilities" Feb 16 22:21:12 crc kubenswrapper[4777]: E0216 
22:21:12.953732 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5052a8d-c789-47a9-9134-746dc7255c59" containerName="registry-server" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.953741 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5052a8d-c789-47a9-9134-746dc7255c59" containerName="registry-server" Feb 16 22:21:12 crc kubenswrapper[4777]: E0216 22:21:12.953757 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5052a8d-c789-47a9-9134-746dc7255c59" containerName="extract-content" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.953764 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5052a8d-c789-47a9-9134-746dc7255c59" containerName="extract-content" Feb 16 22:21:12 crc kubenswrapper[4777]: E0216 22:21:12.953775 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" containerName="extract-utilities" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.953783 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" containerName="extract-utilities" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.954018 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c0bb0e-02a6-4ee2-8600-b394410ae4c3" containerName="registry-server" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.954033 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5052a8d-c789-47a9-9134-746dc7255c59" containerName="registry-server" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.955814 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:12 crc kubenswrapper[4777]: I0216 22:21:12.968262 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dqhmz"] Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.042523 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-catalog-content\") pod \"community-operators-dqhmz\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.042673 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpsf4\" (UniqueName: \"kubernetes.io/projected/b1ab0754-f8df-4f49-953d-e678173dfdec-kube-api-access-kpsf4\") pod \"community-operators-dqhmz\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.042704 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-utilities\") pod \"community-operators-dqhmz\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.058510 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"b5345f8b5c96b1c162c761b1ad77770b60a8fb2735ec3a2536797a7f5ed293d9"} Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.144903 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kpsf4\" (UniqueName: \"kubernetes.io/projected/b1ab0754-f8df-4f49-953d-e678173dfdec-kube-api-access-kpsf4\") pod \"community-operators-dqhmz\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.144973 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-utilities\") pod \"community-operators-dqhmz\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.145055 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-catalog-content\") pod \"community-operators-dqhmz\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.145665 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-catalog-content\") pod \"community-operators-dqhmz\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.147104 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-utilities\") pod \"community-operators-dqhmz\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.172503 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpsf4\" (UniqueName: 
\"kubernetes.io/projected/b1ab0754-f8df-4f49-953d-e678173dfdec-kube-api-access-kpsf4\") pod \"community-operators-dqhmz\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.295831 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:13 crc kubenswrapper[4777]: I0216 22:21:13.858402 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dqhmz"] Feb 16 22:21:13 crc kubenswrapper[4777]: W0216 22:21:13.861193 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ab0754_f8df_4f49_953d_e678173dfdec.slice/crio-df7184ff9cd7068b14ff69ca76697a5370fbd050ad4d8730037bd30250853250 WatchSource:0}: Error finding container df7184ff9cd7068b14ff69ca76697a5370fbd050ad4d8730037bd30250853250: Status 404 returned error can't find the container with id df7184ff9cd7068b14ff69ca76697a5370fbd050ad4d8730037bd30250853250 Feb 16 22:21:14 crc kubenswrapper[4777]: I0216 22:21:14.076354 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dqhmz" event={"ID":"b1ab0754-f8df-4f49-953d-e678173dfdec","Type":"ContainerStarted","Data":"df7184ff9cd7068b14ff69ca76697a5370fbd050ad4d8730037bd30250853250"} Feb 16 22:21:15 crc kubenswrapper[4777]: I0216 22:21:15.090660 4777 generic.go:334] "Generic (PLEG): container finished" podID="b1ab0754-f8df-4f49-953d-e678173dfdec" containerID="624072613e6b7f8018b228860c4cf8defcab907a6b9f65b920a53e33227f988d" exitCode=0 Feb 16 22:21:15 crc kubenswrapper[4777]: I0216 22:21:15.090765 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dqhmz" 
event={"ID":"b1ab0754-f8df-4f49-953d-e678173dfdec","Type":"ContainerDied","Data":"624072613e6b7f8018b228860c4cf8defcab907a6b9f65b920a53e33227f988d"} Feb 16 22:21:15 crc kubenswrapper[4777]: I0216 22:21:15.094264 4777 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 16 22:21:16 crc kubenswrapper[4777]: I0216 22:21:16.102509 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dqhmz" event={"ID":"b1ab0754-f8df-4f49-953d-e678173dfdec","Type":"ContainerStarted","Data":"936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0"} Feb 16 22:21:17 crc kubenswrapper[4777]: I0216 22:21:17.118363 4777 generic.go:334] "Generic (PLEG): container finished" podID="b1ab0754-f8df-4f49-953d-e678173dfdec" containerID="936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0" exitCode=0 Feb 16 22:21:17 crc kubenswrapper[4777]: I0216 22:21:17.118499 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dqhmz" event={"ID":"b1ab0754-f8df-4f49-953d-e678173dfdec","Type":"ContainerDied","Data":"936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0"} Feb 16 22:21:18 crc kubenswrapper[4777]: I0216 22:21:18.132027 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dqhmz" event={"ID":"b1ab0754-f8df-4f49-953d-e678173dfdec","Type":"ContainerStarted","Data":"4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407"} Feb 16 22:21:18 crc kubenswrapper[4777]: I0216 22:21:18.171657 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dqhmz" podStartSLOduration=3.7219184419999998 podStartE2EDuration="6.171636099s" podCreationTimestamp="2026-02-16 22:21:12 +0000 UTC" firstStartedPulling="2026-02-16 22:21:15.093884227 +0000 UTC m=+2595.676385369" lastFinishedPulling="2026-02-16 22:21:17.543601884 +0000 UTC 
m=+2598.126103026" observedRunningTime="2026-02-16 22:21:18.160859126 +0000 UTC m=+2598.743360238" watchObservedRunningTime="2026-02-16 22:21:18.171636099 +0000 UTC m=+2598.754137221" Feb 16 22:21:19 crc kubenswrapper[4777]: E0216 22:21:19.184229 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:21:23 crc kubenswrapper[4777]: I0216 22:21:23.296338 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:23 crc kubenswrapper[4777]: I0216 22:21:23.296914 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:23 crc kubenswrapper[4777]: I0216 22:21:23.373428 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:24 crc kubenswrapper[4777]: I0216 22:21:24.294758 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:24 crc kubenswrapper[4777]: I0216 22:21:24.367666 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dqhmz"] Feb 16 22:21:26 crc kubenswrapper[4777]: I0216 22:21:26.237298 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dqhmz" podUID="b1ab0754-f8df-4f49-953d-e678173dfdec" containerName="registry-server" containerID="cri-o://4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407" gracePeriod=2 Feb 16 22:21:26 crc kubenswrapper[4777]: I0216 22:21:26.833816 4777 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:26 crc kubenswrapper[4777]: I0216 22:21:26.882816 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-utilities\") pod \"b1ab0754-f8df-4f49-953d-e678173dfdec\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " Feb 16 22:21:26 crc kubenswrapper[4777]: I0216 22:21:26.883049 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-catalog-content\") pod \"b1ab0754-f8df-4f49-953d-e678173dfdec\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " Feb 16 22:21:26 crc kubenswrapper[4777]: I0216 22:21:26.883230 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpsf4\" (UniqueName: \"kubernetes.io/projected/b1ab0754-f8df-4f49-953d-e678173dfdec-kube-api-access-kpsf4\") pod \"b1ab0754-f8df-4f49-953d-e678173dfdec\" (UID: \"b1ab0754-f8df-4f49-953d-e678173dfdec\") " Feb 16 22:21:26 crc kubenswrapper[4777]: I0216 22:21:26.885384 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-utilities" (OuterVolumeSpecName: "utilities") pod "b1ab0754-f8df-4f49-953d-e678173dfdec" (UID: "b1ab0754-f8df-4f49-953d-e678173dfdec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:21:26 crc kubenswrapper[4777]: I0216 22:21:26.898984 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1ab0754-f8df-4f49-953d-e678173dfdec-kube-api-access-kpsf4" (OuterVolumeSpecName: "kube-api-access-kpsf4") pod "b1ab0754-f8df-4f49-953d-e678173dfdec" (UID: "b1ab0754-f8df-4f49-953d-e678173dfdec"). InnerVolumeSpecName "kube-api-access-kpsf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:21:26 crc kubenswrapper[4777]: I0216 22:21:26.986558 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpsf4\" (UniqueName: \"kubernetes.io/projected/b1ab0754-f8df-4f49-953d-e678173dfdec-kube-api-access-kpsf4\") on node \"crc\" DevicePath \"\"" Feb 16 22:21:26 crc kubenswrapper[4777]: I0216 22:21:26.986595 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 22:21:26 crc kubenswrapper[4777]: I0216 22:21:26.986982 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1ab0754-f8df-4f49-953d-e678173dfdec" (UID: "b1ab0754-f8df-4f49-953d-e678173dfdec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.089576 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ab0754-f8df-4f49-953d-e678173dfdec-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.250530 4777 generic.go:334] "Generic (PLEG): container finished" podID="b1ab0754-f8df-4f49-953d-e678173dfdec" containerID="4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407" exitCode=0 Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.250580 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dqhmz" event={"ID":"b1ab0754-f8df-4f49-953d-e678173dfdec","Type":"ContainerDied","Data":"4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407"} Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.250618 4777 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-dqhmz" event={"ID":"b1ab0754-f8df-4f49-953d-e678173dfdec","Type":"ContainerDied","Data":"df7184ff9cd7068b14ff69ca76697a5370fbd050ad4d8730037bd30250853250"} Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.250622 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dqhmz" Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.250640 4777 scope.go:117] "RemoveContainer" containerID="4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407" Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.297579 4777 scope.go:117] "RemoveContainer" containerID="936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0" Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.313273 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dqhmz"] Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.324526 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dqhmz"] Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.333232 4777 scope.go:117] "RemoveContainer" containerID="624072613e6b7f8018b228860c4cf8defcab907a6b9f65b920a53e33227f988d" Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.381201 4777 scope.go:117] "RemoveContainer" containerID="4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407" Feb 16 22:21:27 crc kubenswrapper[4777]: E0216 22:21:27.381707 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407\": container with ID starting with 4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407 not found: ID does not exist" containerID="4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407" Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 
22:21:27.381808 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407"} err="failed to get container status \"4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407\": rpc error: code = NotFound desc = could not find container \"4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407\": container with ID starting with 4ac1968dcdba63f4bc07adeb8a48c09cf785cc1eb74e6414e16278cd80cb2407 not found: ID does not exist" Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.381831 4777 scope.go:117] "RemoveContainer" containerID="936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0" Feb 16 22:21:27 crc kubenswrapper[4777]: E0216 22:21:27.382092 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0\": container with ID starting with 936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0 not found: ID does not exist" containerID="936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0" Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.382130 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0"} err="failed to get container status \"936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0\": rpc error: code = NotFound desc = could not find container \"936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0\": container with ID starting with 936f50a71ad3ea5e419241e4c0290f3d00e7c5d544ac358d0ccce2dc852974e0 not found: ID does not exist" Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.382154 4777 scope.go:117] "RemoveContainer" containerID="624072613e6b7f8018b228860c4cf8defcab907a6b9f65b920a53e33227f988d" Feb 16 22:21:27 crc 
kubenswrapper[4777]: E0216 22:21:27.382460 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"624072613e6b7f8018b228860c4cf8defcab907a6b9f65b920a53e33227f988d\": container with ID starting with 624072613e6b7f8018b228860c4cf8defcab907a6b9f65b920a53e33227f988d not found: ID does not exist" containerID="624072613e6b7f8018b228860c4cf8defcab907a6b9f65b920a53e33227f988d" Feb 16 22:21:27 crc kubenswrapper[4777]: I0216 22:21:27.382517 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"624072613e6b7f8018b228860c4cf8defcab907a6b9f65b920a53e33227f988d"} err="failed to get container status \"624072613e6b7f8018b228860c4cf8defcab907a6b9f65b920a53e33227f988d\": rpc error: code = NotFound desc = could not find container \"624072613e6b7f8018b228860c4cf8defcab907a6b9f65b920a53e33227f988d\": container with ID starting with 624072613e6b7f8018b228860c4cf8defcab907a6b9f65b920a53e33227f988d not found: ID does not exist" Feb 16 22:21:28 crc kubenswrapper[4777]: I0216 22:21:28.199398 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1ab0754-f8df-4f49-953d-e678173dfdec" path="/var/lib/kubelet/pods/b1ab0754-f8df-4f49-953d-e678173dfdec/volumes" Feb 16 22:21:31 crc kubenswrapper[4777]: E0216 22:21:31.185334 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:21:42 crc kubenswrapper[4777]: E0216 22:21:42.187531 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:21:53 crc kubenswrapper[4777]: E0216 22:21:53.185467 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:22:08 crc kubenswrapper[4777]: E0216 22:22:08.185558 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:22:19 crc kubenswrapper[4777]: E0216 22:22:19.184946 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:22:30 crc kubenswrapper[4777]: E0216 22:22:30.194952 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:22:45 crc kubenswrapper[4777]: E0216 22:22:45.185062 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:22:45 crc kubenswrapper[4777]: I0216 22:22:45.883745 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bkfzf"]
Feb 16 22:22:45 crc kubenswrapper[4777]: E0216 22:22:45.884402 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ab0754-f8df-4f49-953d-e678173dfdec" containerName="extract-utilities"
Feb 16 22:22:45 crc kubenswrapper[4777]: I0216 22:22:45.884440 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ab0754-f8df-4f49-953d-e678173dfdec" containerName="extract-utilities"
Feb 16 22:22:45 crc kubenswrapper[4777]: E0216 22:22:45.884474 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ab0754-f8df-4f49-953d-e678173dfdec" containerName="extract-content"
Feb 16 22:22:45 crc kubenswrapper[4777]: I0216 22:22:45.884487 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ab0754-f8df-4f49-953d-e678173dfdec" containerName="extract-content"
Feb 16 22:22:45 crc kubenswrapper[4777]: E0216 22:22:45.884542 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ab0754-f8df-4f49-953d-e678173dfdec" containerName="registry-server"
Feb 16 22:22:45 crc kubenswrapper[4777]: I0216 22:22:45.884555 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ab0754-f8df-4f49-953d-e678173dfdec" containerName="registry-server"
Feb 16 22:22:45 crc kubenswrapper[4777]: I0216 22:22:45.885008 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1ab0754-f8df-4f49-953d-e678173dfdec" containerName="registry-server"
Feb 16 22:22:45 crc kubenswrapper[4777]: I0216 22:22:45.887636 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:45 crc kubenswrapper[4777]: I0216 22:22:45.903365 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bkfzf"]
Feb 16 22:22:46 crc kubenswrapper[4777]: I0216 22:22:46.010173 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzlws\" (UniqueName: \"kubernetes.io/projected/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-kube-api-access-mzlws\") pod \"redhat-operators-bkfzf\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") " pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:46 crc kubenswrapper[4777]: I0216 22:22:46.010308 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-catalog-content\") pod \"redhat-operators-bkfzf\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") " pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:46 crc kubenswrapper[4777]: I0216 22:22:46.010484 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-utilities\") pod \"redhat-operators-bkfzf\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") " pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:46 crc kubenswrapper[4777]: I0216 22:22:46.113061 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzlws\" (UniqueName: \"kubernetes.io/projected/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-kube-api-access-mzlws\") pod \"redhat-operators-bkfzf\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") " pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:46 crc kubenswrapper[4777]: I0216 22:22:46.113130 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-catalog-content\") pod \"redhat-operators-bkfzf\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") " pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:46 crc kubenswrapper[4777]: I0216 22:22:46.113185 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-utilities\") pod \"redhat-operators-bkfzf\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") " pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:46 crc kubenswrapper[4777]: I0216 22:22:46.113570 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-utilities\") pod \"redhat-operators-bkfzf\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") " pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:46 crc kubenswrapper[4777]: I0216 22:22:46.113953 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-catalog-content\") pod \"redhat-operators-bkfzf\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") " pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:46 crc kubenswrapper[4777]: I0216 22:22:46.145510 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzlws\" (UniqueName: \"kubernetes.io/projected/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-kube-api-access-mzlws\") pod \"redhat-operators-bkfzf\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") " pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:46 crc kubenswrapper[4777]: I0216 22:22:46.235345 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:46 crc kubenswrapper[4777]: I0216 22:22:46.787463 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bkfzf"]
Feb 16 22:22:47 crc kubenswrapper[4777]: I0216 22:22:47.221903 4777 generic.go:334] "Generic (PLEG): container finished" podID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerID="c79b2bcbef739c063ca0946c883f221d50374dec46f837533784c47e19cc5eb4" exitCode=0
Feb 16 22:22:47 crc kubenswrapper[4777]: I0216 22:22:47.222137 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkfzf" event={"ID":"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa","Type":"ContainerDied","Data":"c79b2bcbef739c063ca0946c883f221d50374dec46f837533784c47e19cc5eb4"}
Feb 16 22:22:47 crc kubenswrapper[4777]: I0216 22:22:47.222203 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkfzf" event={"ID":"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa","Type":"ContainerStarted","Data":"a24f595ccf4c2c700693a9ef8ca76a9b2e4a6a66f2be135037a31283282cbb21"}
Feb 16 22:22:48 crc kubenswrapper[4777]: I0216 22:22:48.235453 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkfzf" event={"ID":"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa","Type":"ContainerStarted","Data":"b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88"}
Feb 16 22:22:49 crc kubenswrapper[4777]: I0216 22:22:49.251628 4777 generic.go:334] "Generic (PLEG): container finished" podID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerID="b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88" exitCode=0
Feb 16 22:22:49 crc kubenswrapper[4777]: I0216 22:22:49.251693 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkfzf" event={"ID":"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa","Type":"ContainerDied","Data":"b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88"}
Feb 16 22:22:50 crc kubenswrapper[4777]: I0216 22:22:50.272697 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkfzf" event={"ID":"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa","Type":"ContainerStarted","Data":"b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b"}
Feb 16 22:22:50 crc kubenswrapper[4777]: I0216 22:22:50.301787 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bkfzf" podStartSLOduration=2.814687286 podStartE2EDuration="5.301767982s" podCreationTimestamp="2026-02-16 22:22:45 +0000 UTC" firstStartedPulling="2026-02-16 22:22:47.223722271 +0000 UTC m=+2687.806223453" lastFinishedPulling="2026-02-16 22:22:49.710803017 +0000 UTC m=+2690.293304149" observedRunningTime="2026-02-16 22:22:50.299094597 +0000 UTC m=+2690.881595709" watchObservedRunningTime="2026-02-16 22:22:50.301767982 +0000 UTC m=+2690.884269094"
Feb 16 22:22:56 crc kubenswrapper[4777]: I0216 22:22:56.235891 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:56 crc kubenswrapper[4777]: I0216 22:22:56.237894 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:22:57 crc kubenswrapper[4777]: E0216 22:22:57.186169 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:22:57 crc kubenswrapper[4777]: I0216 22:22:57.317760 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bkfzf" podUID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerName="registry-server" probeResult="failure" output=<
Feb 16 22:22:57 crc kubenswrapper[4777]: timeout: failed to connect service ":50051" within 1s
Feb 16 22:22:57 crc kubenswrapper[4777]: >
Feb 16 22:23:06 crc kubenswrapper[4777]: I0216 22:23:06.312803 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:23:06 crc kubenswrapper[4777]: I0216 22:23:06.384737 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:23:06 crc kubenswrapper[4777]: I0216 22:23:06.560248 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bkfzf"]
Feb 16 22:23:07 crc kubenswrapper[4777]: I0216 22:23:07.469339 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bkfzf" podUID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerName="registry-server" containerID="cri-o://b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b" gracePeriod=2
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.018631 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.146788 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzlws\" (UniqueName: \"kubernetes.io/projected/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-kube-api-access-mzlws\") pod \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") "
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.146898 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-catalog-content\") pod \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") "
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.146930 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-utilities\") pod \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\" (UID: \"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa\") "
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.148227 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-utilities" (OuterVolumeSpecName: "utilities") pod "ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" (UID: "ecf3389e-1d3f-4422-a11b-e8ee56deeeaa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.153571 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-kube-api-access-mzlws" (OuterVolumeSpecName: "kube-api-access-mzlws") pod "ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" (UID: "ecf3389e-1d3f-4422-a11b-e8ee56deeeaa"). InnerVolumeSpecName "kube-api-access-mzlws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.249276 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzlws\" (UniqueName: \"kubernetes.io/projected/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-kube-api-access-mzlws\") on node \"crc\" DevicePath \"\""
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.249315 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-utilities\") on node \"crc\" DevicePath \"\""
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.277024 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" (UID: "ecf3389e-1d3f-4422-a11b-e8ee56deeeaa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.351477 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.483053 4777 generic.go:334] "Generic (PLEG): container finished" podID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerID="b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b" exitCode=0
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.483099 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkfzf" event={"ID":"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa","Type":"ContainerDied","Data":"b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b"}
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.483131 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkfzf" event={"ID":"ecf3389e-1d3f-4422-a11b-e8ee56deeeaa","Type":"ContainerDied","Data":"a24f595ccf4c2c700693a9ef8ca76a9b2e4a6a66f2be135037a31283282cbb21"}
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.483149 4777 scope.go:117] "RemoveContainer" containerID="b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b"
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.483294 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bkfzf"
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.518301 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bkfzf"]
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.528638 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bkfzf"]
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.529900 4777 scope.go:117] "RemoveContainer" containerID="b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88"
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.553456 4777 scope.go:117] "RemoveContainer" containerID="c79b2bcbef739c063ca0946c883f221d50374dec46f837533784c47e19cc5eb4"
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.623357 4777 scope.go:117] "RemoveContainer" containerID="b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b"
Feb 16 22:23:08 crc kubenswrapper[4777]: E0216 22:23:08.623844 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b\": container with ID starting with b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b not found: ID does not exist" containerID="b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b"
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.623874 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b"} err="failed to get container status \"b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b\": rpc error: code = NotFound desc = could not find container \"b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b\": container with ID starting with b99ab05a7537452871132655742b1121ca95492bb41b87bc3eeb62ac08fea17b not found: ID does not exist"
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.623896 4777 scope.go:117] "RemoveContainer" containerID="b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88"
Feb 16 22:23:08 crc kubenswrapper[4777]: E0216 22:23:08.624216 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88\": container with ID starting with b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88 not found: ID does not exist" containerID="b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88"
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.624241 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88"} err="failed to get container status \"b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88\": rpc error: code = NotFound desc = could not find container \"b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88\": container with ID starting with b6630ab0ce2835790094e68db6df3bd7293291252995481e0b9d720870fbdd88 not found: ID does not exist"
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.624258 4777 scope.go:117] "RemoveContainer" containerID="c79b2bcbef739c063ca0946c883f221d50374dec46f837533784c47e19cc5eb4"
Feb 16 22:23:08 crc kubenswrapper[4777]: E0216 22:23:08.624684 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c79b2bcbef739c063ca0946c883f221d50374dec46f837533784c47e19cc5eb4\": container with ID starting with c79b2bcbef739c063ca0946c883f221d50374dec46f837533784c47e19cc5eb4 not found: ID does not exist" containerID="c79b2bcbef739c063ca0946c883f221d50374dec46f837533784c47e19cc5eb4"
Feb 16 22:23:08 crc kubenswrapper[4777]: I0216 22:23:08.624840 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c79b2bcbef739c063ca0946c883f221d50374dec46f837533784c47e19cc5eb4"} err="failed to get container status \"c79b2bcbef739c063ca0946c883f221d50374dec46f837533784c47e19cc5eb4\": rpc error: code = NotFound desc = could not find container \"c79b2bcbef739c063ca0946c883f221d50374dec46f837533784c47e19cc5eb4\": container with ID starting with c79b2bcbef739c063ca0946c883f221d50374dec46f837533784c47e19cc5eb4 not found: ID does not exist"
Feb 16 22:23:09 crc kubenswrapper[4777]: E0216 22:23:09.184559 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:23:10 crc kubenswrapper[4777]: I0216 22:23:10.201601 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" path="/var/lib/kubelet/pods/ecf3389e-1d3f-4422-a11b-e8ee56deeeaa/volumes"
Feb 16 22:23:23 crc kubenswrapper[4777]: E0216 22:23:23.185135 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:23:34 crc kubenswrapper[4777]: E0216 22:23:34.183935 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:23:41 crc kubenswrapper[4777]: I0216 22:23:41.651839 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:23:41 crc kubenswrapper[4777]: I0216 22:23:41.652200 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:23:45 crc kubenswrapper[4777]: E0216 22:23:45.184522 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:23:58 crc kubenswrapper[4777]: E0216 22:23:58.186309 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:24:11 crc kubenswrapper[4777]: E0216 22:24:11.185656 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:24:11 crc kubenswrapper[4777]: I0216 22:24:11.651982 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:24:11 crc kubenswrapper[4777]: I0216 22:24:11.652041 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:24:25 crc kubenswrapper[4777]: E0216 22:24:25.185150 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:24:36 crc kubenswrapper[4777]: E0216 22:24:36.298066 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current"
Feb 16 22:24:36 crc kubenswrapper[4777]: E0216 22:24:36.298736 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current"
Feb 16 22:24:36 crc kubenswrapper[4777]: E0216 22:24:36.298991 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError"
Feb 16 22:24:36 crc kubenswrapper[4777]: E0216 22:24:36.300556 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:24:41 crc kubenswrapper[4777]: I0216 22:24:41.652159 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:24:41 crc kubenswrapper[4777]: I0216 22:24:41.653161 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:24:41 crc kubenswrapper[4777]: I0216 22:24:41.653354 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj"
Feb 16 22:24:41 crc kubenswrapper[4777]: I0216 22:24:41.655337 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5345f8b5c96b1c162c761b1ad77770b60a8fb2735ec3a2536797a7f5ed293d9"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 16 22:24:41 crc kubenswrapper[4777]: I0216 22:24:41.655446 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://b5345f8b5c96b1c162c761b1ad77770b60a8fb2735ec3a2536797a7f5ed293d9" gracePeriod=600
Feb 16 22:24:42 crc kubenswrapper[4777]: I0216 22:24:42.588939 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"b5345f8b5c96b1c162c761b1ad77770b60a8fb2735ec3a2536797a7f5ed293d9"}
Feb 16 22:24:42 crc kubenswrapper[4777]: I0216 22:24:42.589443 4777 scope.go:117] "RemoveContainer" containerID="80523540a2ea3a9c147ca37b76164899e0f6f7a969a28c015e640d9eafabf0c0"
Feb 16 22:24:42 crc kubenswrapper[4777]: I0216 22:24:42.588883 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="b5345f8b5c96b1c162c761b1ad77770b60a8fb2735ec3a2536797a7f5ed293d9" exitCode=0
Feb 16 22:24:42 crc kubenswrapper[4777]: I0216 22:24:42.589690 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7"}
Feb 16 22:24:49 crc kubenswrapper[4777]: E0216 22:24:49.184665 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:25:03 crc kubenswrapper[4777]: E0216 22:25:03.184558 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:25:16 crc kubenswrapper[4777]: E0216 22:25:16.184645 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:25:27 crc kubenswrapper[4777]: E0216 22:25:27.186300 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:25:40 crc kubenswrapper[4777]: E0216 22:25:40.199344 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:25:55 crc kubenswrapper[4777]: E0216 22:25:55.185281 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:26:08 crc kubenswrapper[4777]: E0216 22:26:08.186426 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:26:20 crc kubenswrapper[4777]: E0216 22:26:20.188240 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:26:31 crc kubenswrapper[4777]: E0216 22:26:31.185419 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:26:41 crc kubenswrapper[4777]: I0216 22:26:41.652143 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:26:41 crc kubenswrapper[4777]: I0216 22:26:41.652841 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:26:44 crc kubenswrapper[4777]: E0216 22:26:44.187814 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:26:55 crc kubenswrapper[4777]: E0216 22:26:55.185433 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:27:07 crc kubenswrapper[4777]: E0216 22:27:07.186676 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:27:11 crc kubenswrapper[4777]: I0216 22:27:11.651386 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:27:11 crc kubenswrapper[4777]: I0216 22:27:11.651957 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.018100 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-htc2k"]
Feb 16 22:27:12 crc kubenswrapper[4777]: E0216 22:27:12.018932 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerName="extract-utilities"
Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.018953 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerName="extract-utilities"
Feb 16 22:27:12 crc
kubenswrapper[4777]: E0216 22:27:12.018968 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerName="registry-server" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.018977 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerName="registry-server" Feb 16 22:27:12 crc kubenswrapper[4777]: E0216 22:27:12.018991 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerName="extract-content" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.019000 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerName="extract-content" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.019261 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecf3389e-1d3f-4422-a11b-e8ee56deeeaa" containerName="registry-server" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.021080 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.031454 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-htc2k"] Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.101379 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-utilities\") pod \"certified-operators-htc2k\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.101628 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-catalog-content\") pod \"certified-operators-htc2k\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.102097 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zznjl\" (UniqueName: \"kubernetes.io/projected/91886aa4-cc59-404d-9ef6-a4062f63b3c2-kube-api-access-zznjl\") pod \"certified-operators-htc2k\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.204844 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zznjl\" (UniqueName: \"kubernetes.io/projected/91886aa4-cc59-404d-9ef6-a4062f63b3c2-kube-api-access-zznjl\") pod \"certified-operators-htc2k\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.205199 4777 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-utilities\") pod \"certified-operators-htc2k\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.205360 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-catalog-content\") pod \"certified-operators-htc2k\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.205926 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-utilities\") pod \"certified-operators-htc2k\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.205968 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-catalog-content\") pod \"certified-operators-htc2k\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.231631 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zznjl\" (UniqueName: \"kubernetes.io/projected/91886aa4-cc59-404d-9ef6-a4062f63b3c2-kube-api-access-zznjl\") pod \"certified-operators-htc2k\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.361242 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:12 crc kubenswrapper[4777]: I0216 22:27:12.877602 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-htc2k"] Feb 16 22:27:13 crc kubenswrapper[4777]: I0216 22:27:13.411165 4777 generic.go:334] "Generic (PLEG): container finished" podID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" containerID="9f43f39c2675a3668b1152fd8a77a10f6ddcc353069cd0fdf4296d7304666d62" exitCode=0 Feb 16 22:27:13 crc kubenswrapper[4777]: I0216 22:27:13.411297 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htc2k" event={"ID":"91886aa4-cc59-404d-9ef6-a4062f63b3c2","Type":"ContainerDied","Data":"9f43f39c2675a3668b1152fd8a77a10f6ddcc353069cd0fdf4296d7304666d62"} Feb 16 22:27:13 crc kubenswrapper[4777]: I0216 22:27:13.411924 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htc2k" event={"ID":"91886aa4-cc59-404d-9ef6-a4062f63b3c2","Type":"ContainerStarted","Data":"60c020aac55dcd4ebf386758403a8c0ef858ab8bf6769dc3a27fcc6ed92da759"} Feb 16 22:27:13 crc kubenswrapper[4777]: I0216 22:27:13.414560 4777 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.212478 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-66tm6"] Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.216954 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.236087 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66tm6"] Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.263816 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-catalog-content\") pod \"redhat-marketplace-66tm6\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.263886 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-utilities\") pod \"redhat-marketplace-66tm6\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.264004 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb6sq\" (UniqueName: \"kubernetes.io/projected/5ae5d7d5-f64c-4a9e-81fb-60664123e692-kube-api-access-sb6sq\") pod \"redhat-marketplace-66tm6\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.365430 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb6sq\" (UniqueName: \"kubernetes.io/projected/5ae5d7d5-f64c-4a9e-81fb-60664123e692-kube-api-access-sb6sq\") pod \"redhat-marketplace-66tm6\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.365672 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-catalog-content\") pod \"redhat-marketplace-66tm6\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.366228 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-catalog-content\") pod \"redhat-marketplace-66tm6\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.366272 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-utilities\") pod \"redhat-marketplace-66tm6\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.366313 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-utilities\") pod \"redhat-marketplace-66tm6\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.389641 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb6sq\" (UniqueName: \"kubernetes.io/projected/5ae5d7d5-f64c-4a9e-81fb-60664123e692-kube-api-access-sb6sq\") pod \"redhat-marketplace-66tm6\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.436379 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-htc2k" event={"ID":"91886aa4-cc59-404d-9ef6-a4062f63b3c2","Type":"ContainerStarted","Data":"1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910"} Feb 16 22:27:14 crc kubenswrapper[4777]: I0216 22:27:14.630745 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:15 crc kubenswrapper[4777]: I0216 22:27:15.167277 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66tm6"] Feb 16 22:27:15 crc kubenswrapper[4777]: W0216 22:27:15.173955 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ae5d7d5_f64c_4a9e_81fb_60664123e692.slice/crio-e93dacfc8126897c406b1600326ab7aaef78bda15e20a4298873a67d8d08aeef WatchSource:0}: Error finding container e93dacfc8126897c406b1600326ab7aaef78bda15e20a4298873a67d8d08aeef: Status 404 returned error can't find the container with id e93dacfc8126897c406b1600326ab7aaef78bda15e20a4298873a67d8d08aeef Feb 16 22:27:15 crc kubenswrapper[4777]: I0216 22:27:15.450999 4777 generic.go:334] "Generic (PLEG): container finished" podID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" containerID="1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910" exitCode=0 Feb 16 22:27:15 crc kubenswrapper[4777]: I0216 22:27:15.451100 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htc2k" event={"ID":"91886aa4-cc59-404d-9ef6-a4062f63b3c2","Type":"ContainerDied","Data":"1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910"} Feb 16 22:27:15 crc kubenswrapper[4777]: I0216 22:27:15.457017 4777 generic.go:334] "Generic (PLEG): container finished" podID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" containerID="d2cbde2f4d5d33117abbbb1fa748f265ee95c4afd1d99fb539d3bb2db32cdc87" exitCode=0 Feb 16 22:27:15 crc kubenswrapper[4777]: I0216 
22:27:15.457062 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66tm6" event={"ID":"5ae5d7d5-f64c-4a9e-81fb-60664123e692","Type":"ContainerDied","Data":"d2cbde2f4d5d33117abbbb1fa748f265ee95c4afd1d99fb539d3bb2db32cdc87"} Feb 16 22:27:15 crc kubenswrapper[4777]: I0216 22:27:15.457092 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66tm6" event={"ID":"5ae5d7d5-f64c-4a9e-81fb-60664123e692","Type":"ContainerStarted","Data":"e93dacfc8126897c406b1600326ab7aaef78bda15e20a4298873a67d8d08aeef"} Feb 16 22:27:16 crc kubenswrapper[4777]: I0216 22:27:16.470138 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66tm6" event={"ID":"5ae5d7d5-f64c-4a9e-81fb-60664123e692","Type":"ContainerStarted","Data":"8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f"} Feb 16 22:27:16 crc kubenswrapper[4777]: I0216 22:27:16.476339 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htc2k" event={"ID":"91886aa4-cc59-404d-9ef6-a4062f63b3c2","Type":"ContainerStarted","Data":"f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033"} Feb 16 22:27:16 crc kubenswrapper[4777]: I0216 22:27:16.526932 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-htc2k" podStartSLOduration=2.804509449 podStartE2EDuration="5.526908186s" podCreationTimestamp="2026-02-16 22:27:11 +0000 UTC" firstStartedPulling="2026-02-16 22:27:13.414160135 +0000 UTC m=+2953.996661277" lastFinishedPulling="2026-02-16 22:27:16.136558872 +0000 UTC m=+2956.719060014" observedRunningTime="2026-02-16 22:27:16.521517925 +0000 UTC m=+2957.104019067" watchObservedRunningTime="2026-02-16 22:27:16.526908186 +0000 UTC m=+2957.109409318" Feb 16 22:27:17 crc kubenswrapper[4777]: I0216 22:27:17.490552 4777 generic.go:334] "Generic (PLEG): container finished" 
podID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" containerID="8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f" exitCode=0 Feb 16 22:27:17 crc kubenswrapper[4777]: I0216 22:27:17.490615 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66tm6" event={"ID":"5ae5d7d5-f64c-4a9e-81fb-60664123e692","Type":"ContainerDied","Data":"8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f"} Feb 16 22:27:18 crc kubenswrapper[4777]: I0216 22:27:18.503435 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66tm6" event={"ID":"5ae5d7d5-f64c-4a9e-81fb-60664123e692","Type":"ContainerStarted","Data":"1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd"} Feb 16 22:27:18 crc kubenswrapper[4777]: I0216 22:27:18.528594 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-66tm6" podStartSLOduration=2.104395249 podStartE2EDuration="4.528568477s" podCreationTimestamp="2026-02-16 22:27:14 +0000 UTC" firstStartedPulling="2026-02-16 22:27:15.459175893 +0000 UTC m=+2956.041676995" lastFinishedPulling="2026-02-16 22:27:17.883349081 +0000 UTC m=+2958.465850223" observedRunningTime="2026-02-16 22:27:18.526958752 +0000 UTC m=+2959.109459864" watchObservedRunningTime="2026-02-16 22:27:18.528568477 +0000 UTC m=+2959.111069619" Feb 16 22:27:21 crc kubenswrapper[4777]: E0216 22:27:21.185762 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:27:22 crc kubenswrapper[4777]: I0216 22:27:22.362301 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:22 crc kubenswrapper[4777]: I0216 22:27:22.362687 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:22 crc kubenswrapper[4777]: I0216 22:27:22.446678 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:22 crc kubenswrapper[4777]: I0216 22:27:22.633193 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:24 crc kubenswrapper[4777]: I0216 22:27:24.589053 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-htc2k"] Feb 16 22:27:24 crc kubenswrapper[4777]: I0216 22:27:24.590010 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-htc2k" podUID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" containerName="registry-server" containerID="cri-o://f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033" gracePeriod=2 Feb 16 22:27:24 crc kubenswrapper[4777]: I0216 22:27:24.631178 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:24 crc kubenswrapper[4777]: I0216 22:27:24.631836 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:24 crc kubenswrapper[4777]: I0216 22:27:24.719551 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.276532 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.456086 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zznjl\" (UniqueName: \"kubernetes.io/projected/91886aa4-cc59-404d-9ef6-a4062f63b3c2-kube-api-access-zznjl\") pod \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.456236 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-catalog-content\") pod \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.456316 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-utilities\") pod \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\" (UID: \"91886aa4-cc59-404d-9ef6-a4062f63b3c2\") " Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.458184 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-utilities" (OuterVolumeSpecName: "utilities") pod "91886aa4-cc59-404d-9ef6-a4062f63b3c2" (UID: "91886aa4-cc59-404d-9ef6-a4062f63b3c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.466045 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91886aa4-cc59-404d-9ef6-a4062f63b3c2-kube-api-access-zznjl" (OuterVolumeSpecName: "kube-api-access-zznjl") pod "91886aa4-cc59-404d-9ef6-a4062f63b3c2" (UID: "91886aa4-cc59-404d-9ef6-a4062f63b3c2"). InnerVolumeSpecName "kube-api-access-zznjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.510243 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91886aa4-cc59-404d-9ef6-a4062f63b3c2" (UID: "91886aa4-cc59-404d-9ef6-a4062f63b3c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.560553 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.560601 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91886aa4-cc59-404d-9ef6-a4062f63b3c2-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.560622 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zznjl\" (UniqueName: \"kubernetes.io/projected/91886aa4-cc59-404d-9ef6-a4062f63b3c2-kube-api-access-zznjl\") on node \"crc\" DevicePath \"\"" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.589070 4777 generic.go:334] "Generic (PLEG): container finished" podID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" containerID="f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033" exitCode=0 Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.589157 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-htc2k" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.589180 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htc2k" event={"ID":"91886aa4-cc59-404d-9ef6-a4062f63b3c2","Type":"ContainerDied","Data":"f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033"} Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.589254 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htc2k" event={"ID":"91886aa4-cc59-404d-9ef6-a4062f63b3c2","Type":"ContainerDied","Data":"60c020aac55dcd4ebf386758403a8c0ef858ab8bf6769dc3a27fcc6ed92da759"} Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.589278 4777 scope.go:117] "RemoveContainer" containerID="f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.615068 4777 scope.go:117] "RemoveContainer" containerID="1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.635639 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-htc2k"] Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.645676 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-htc2k"] Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.657090 4777 scope.go:117] "RemoveContainer" containerID="9f43f39c2675a3668b1152fd8a77a10f6ddcc353069cd0fdf4296d7304666d62" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.674587 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.697424 4777 scope.go:117] "RemoveContainer" containerID="f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033" Feb 16 22:27:25 crc 
kubenswrapper[4777]: E0216 22:27:25.698180 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033\": container with ID starting with f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033 not found: ID does not exist" containerID="f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.698253 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033"} err="failed to get container status \"f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033\": rpc error: code = NotFound desc = could not find container \"f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033\": container with ID starting with f8ef9cadc836395f3842366f90d6ad7745a8cdf03806a5e096b9c54d7fb4c033 not found: ID does not exist" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.698283 4777 scope.go:117] "RemoveContainer" containerID="1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910" Feb 16 22:27:25 crc kubenswrapper[4777]: E0216 22:27:25.699064 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910\": container with ID starting with 1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910 not found: ID does not exist" containerID="1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.699136 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910"} err="failed to get container status 
\"1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910\": rpc error: code = NotFound desc = could not find container \"1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910\": container with ID starting with 1e5681b708c34f0d9207ab47a4fde343251a1878d18d747128bc7afeed5bb910 not found: ID does not exist" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.699182 4777 scope.go:117] "RemoveContainer" containerID="9f43f39c2675a3668b1152fd8a77a10f6ddcc353069cd0fdf4296d7304666d62" Feb 16 22:27:25 crc kubenswrapper[4777]: E0216 22:27:25.699666 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f43f39c2675a3668b1152fd8a77a10f6ddcc353069cd0fdf4296d7304666d62\": container with ID starting with 9f43f39c2675a3668b1152fd8a77a10f6ddcc353069cd0fdf4296d7304666d62 not found: ID does not exist" containerID="9f43f39c2675a3668b1152fd8a77a10f6ddcc353069cd0fdf4296d7304666d62" Feb 16 22:27:25 crc kubenswrapper[4777]: I0216 22:27:25.699748 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f43f39c2675a3668b1152fd8a77a10f6ddcc353069cd0fdf4296d7304666d62"} err="failed to get container status \"9f43f39c2675a3668b1152fd8a77a10f6ddcc353069cd0fdf4296d7304666d62\": rpc error: code = NotFound desc = could not find container \"9f43f39c2675a3668b1152fd8a77a10f6ddcc353069cd0fdf4296d7304666d62\": container with ID starting with 9f43f39c2675a3668b1152fd8a77a10f6ddcc353069cd0fdf4296d7304666d62 not found: ID does not exist" Feb 16 22:27:26 crc kubenswrapper[4777]: I0216 22:27:26.197001 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" path="/var/lib/kubelet/pods/91886aa4-cc59-404d-9ef6-a4062f63b3c2/volumes" Feb 16 22:27:27 crc kubenswrapper[4777]: I0216 22:27:27.976812 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66tm6"] Feb 16 
22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.631400 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-66tm6" podUID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" containerName="registry-server" containerID="cri-o://1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd" gracePeriod=2 Feb 16 22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.823997 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8tfd5/must-gather-wzvlc"] Feb 16 22:27:28 crc kubenswrapper[4777]: E0216 22:27:28.830402 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" containerName="extract-content" Feb 16 22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.830484 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" containerName="extract-content" Feb 16 22:27:28 crc kubenswrapper[4777]: E0216 22:27:28.830543 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" containerName="registry-server" Feb 16 22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.830598 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" containerName="registry-server" Feb 16 22:27:28 crc kubenswrapper[4777]: E0216 22:27:28.830669 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" containerName="extract-utilities" Feb 16 22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.830745 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" containerName="extract-utilities" Feb 16 22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.830992 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="91886aa4-cc59-404d-9ef6-a4062f63b3c2" containerName="registry-server" Feb 16 22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.832109 4777 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tfd5/must-gather-wzvlc" Feb 16 22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.839747 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8tfd5"/"openshift-service-ca.crt" Feb 16 22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.858946 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8tfd5"/"kube-root-ca.crt" Feb 16 22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.862031 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8tfd5/must-gather-wzvlc"] Feb 16 22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.948867 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwgdg\" (UniqueName: \"kubernetes.io/projected/25a628c7-c0f3-4385-a217-4a37c82197b8-kube-api-access-kwgdg\") pod \"must-gather-wzvlc\" (UID: \"25a628c7-c0f3-4385-a217-4a37c82197b8\") " pod="openshift-must-gather-8tfd5/must-gather-wzvlc" Feb 16 22:27:28 crc kubenswrapper[4777]: I0216 22:27:28.949183 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/25a628c7-c0f3-4385-a217-4a37c82197b8-must-gather-output\") pod \"must-gather-wzvlc\" (UID: \"25a628c7-c0f3-4385-a217-4a37c82197b8\") " pod="openshift-must-gather-8tfd5/must-gather-wzvlc" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.051082 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwgdg\" (UniqueName: \"kubernetes.io/projected/25a628c7-c0f3-4385-a217-4a37c82197b8-kube-api-access-kwgdg\") pod \"must-gather-wzvlc\" (UID: \"25a628c7-c0f3-4385-a217-4a37c82197b8\") " pod="openshift-must-gather-8tfd5/must-gather-wzvlc" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.051176 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/25a628c7-c0f3-4385-a217-4a37c82197b8-must-gather-output\") pod \"must-gather-wzvlc\" (UID: \"25a628c7-c0f3-4385-a217-4a37c82197b8\") " pod="openshift-must-gather-8tfd5/must-gather-wzvlc" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.051552 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/25a628c7-c0f3-4385-a217-4a37c82197b8-must-gather-output\") pod \"must-gather-wzvlc\" (UID: \"25a628c7-c0f3-4385-a217-4a37c82197b8\") " pod="openshift-must-gather-8tfd5/must-gather-wzvlc" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.072611 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwgdg\" (UniqueName: \"kubernetes.io/projected/25a628c7-c0f3-4385-a217-4a37c82197b8-kube-api-access-kwgdg\") pod \"must-gather-wzvlc\" (UID: \"25a628c7-c0f3-4385-a217-4a37c82197b8\") " pod="openshift-must-gather-8tfd5/must-gather-wzvlc" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.159141 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tfd5/must-gather-wzvlc" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.430591 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.572570 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6sq\" (UniqueName: \"kubernetes.io/projected/5ae5d7d5-f64c-4a9e-81fb-60664123e692-kube-api-access-sb6sq\") pod \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.572945 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-catalog-content\") pod \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.573169 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-utilities\") pod \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\" (UID: \"5ae5d7d5-f64c-4a9e-81fb-60664123e692\") " Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.577286 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-utilities" (OuterVolumeSpecName: "utilities") pod "5ae5d7d5-f64c-4a9e-81fb-60664123e692" (UID: "5ae5d7d5-f64c-4a9e-81fb-60664123e692"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.578884 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae5d7d5-f64c-4a9e-81fb-60664123e692-kube-api-access-sb6sq" (OuterVolumeSpecName: "kube-api-access-sb6sq") pod "5ae5d7d5-f64c-4a9e-81fb-60664123e692" (UID: "5ae5d7d5-f64c-4a9e-81fb-60664123e692"). InnerVolumeSpecName "kube-api-access-sb6sq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.594270 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ae5d7d5-f64c-4a9e-81fb-60664123e692" (UID: "5ae5d7d5-f64c-4a9e-81fb-60664123e692"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.643404 4777 generic.go:334] "Generic (PLEG): container finished" podID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" containerID="1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd" exitCode=0 Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.643455 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66tm6" event={"ID":"5ae5d7d5-f64c-4a9e-81fb-60664123e692","Type":"ContainerDied","Data":"1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd"} Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.643485 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66tm6" event={"ID":"5ae5d7d5-f64c-4a9e-81fb-60664123e692","Type":"ContainerDied","Data":"e93dacfc8126897c406b1600326ab7aaef78bda15e20a4298873a67d8d08aeef"} Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.643502 4777 scope.go:117] "RemoveContainer" containerID="1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.643557 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66tm6" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.675440 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6sq\" (UniqueName: \"kubernetes.io/projected/5ae5d7d5-f64c-4a9e-81fb-60664123e692-kube-api-access-sb6sq\") on node \"crc\" DevicePath \"\"" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.678080 4777 scope.go:117] "RemoveContainer" containerID="8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.678299 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.678341 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae5d7d5-f64c-4a9e-81fb-60664123e692-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.686662 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66tm6"] Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.695511 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-66tm6"] Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.699431 4777 scope.go:117] "RemoveContainer" containerID="d2cbde2f4d5d33117abbbb1fa748f265ee95c4afd1d99fb539d3bb2db32cdc87" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.726618 4777 scope.go:117] "RemoveContainer" containerID="1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd" Feb 16 22:27:29 crc kubenswrapper[4777]: E0216 22:27:29.729195 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd\": container with ID starting with 1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd not found: ID does not exist" containerID="1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.729243 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd"} err="failed to get container status \"1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd\": rpc error: code = NotFound desc = could not find container \"1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd\": container with ID starting with 1d6b46013131101e761e009ef6b072bd8a2ddb30f63377dbc30cc640ed2b94fd not found: ID does not exist" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.729280 4777 scope.go:117] "RemoveContainer" containerID="8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f" Feb 16 22:27:29 crc kubenswrapper[4777]: E0216 22:27:29.729621 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f\": container with ID starting with 8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f not found: ID does not exist" containerID="8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.729734 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f"} err="failed to get container status \"8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f\": rpc error: code = NotFound desc = could not find container \"8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f\": container with ID 
starting with 8a8840e8dcab6b909b8e22b298611da53e005a5a1429319177f6bf3626bf867f not found: ID does not exist" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.729809 4777 scope.go:117] "RemoveContainer" containerID="d2cbde2f4d5d33117abbbb1fa748f265ee95c4afd1d99fb539d3bb2db32cdc87" Feb 16 22:27:29 crc kubenswrapper[4777]: E0216 22:27:29.730070 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2cbde2f4d5d33117abbbb1fa748f265ee95c4afd1d99fb539d3bb2db32cdc87\": container with ID starting with d2cbde2f4d5d33117abbbb1fa748f265ee95c4afd1d99fb539d3bb2db32cdc87 not found: ID does not exist" containerID="d2cbde2f4d5d33117abbbb1fa748f265ee95c4afd1d99fb539d3bb2db32cdc87" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.730144 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2cbde2f4d5d33117abbbb1fa748f265ee95c4afd1d99fb539d3bb2db32cdc87"} err="failed to get container status \"d2cbde2f4d5d33117abbbb1fa748f265ee95c4afd1d99fb539d3bb2db32cdc87\": rpc error: code = NotFound desc = could not find container \"d2cbde2f4d5d33117abbbb1fa748f265ee95c4afd1d99fb539d3bb2db32cdc87\": container with ID starting with d2cbde2f4d5d33117abbbb1fa748f265ee95c4afd1d99fb539d3bb2db32cdc87 not found: ID does not exist" Feb 16 22:27:29 crc kubenswrapper[4777]: I0216 22:27:29.783816 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8tfd5/must-gather-wzvlc"] Feb 16 22:27:30 crc kubenswrapper[4777]: I0216 22:27:30.195230 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" path="/var/lib/kubelet/pods/5ae5d7d5-f64c-4a9e-81fb-60664123e692/volumes" Feb 16 22:27:30 crc kubenswrapper[4777]: I0216 22:27:30.659315 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tfd5/must-gather-wzvlc" 
event={"ID":"25a628c7-c0f3-4385-a217-4a37c82197b8","Type":"ContainerStarted","Data":"c35af878a8a6b40ff43f20acdb8dd71a1c54ef3d18b854e66db5628fa0c5b48e"} Feb 16 22:27:36 crc kubenswrapper[4777]: E0216 22:27:36.185307 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:27:36 crc kubenswrapper[4777]: I0216 22:27:36.722322 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tfd5/must-gather-wzvlc" event={"ID":"25a628c7-c0f3-4385-a217-4a37c82197b8","Type":"ContainerStarted","Data":"364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96"} Feb 16 22:27:37 crc kubenswrapper[4777]: I0216 22:27:37.735043 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tfd5/must-gather-wzvlc" event={"ID":"25a628c7-c0f3-4385-a217-4a37c82197b8","Type":"ContainerStarted","Data":"38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911"} Feb 16 22:27:37 crc kubenswrapper[4777]: I0216 22:27:37.766899 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8tfd5/must-gather-wzvlc" podStartSLOduration=3.136747944 podStartE2EDuration="9.766879951s" podCreationTimestamp="2026-02-16 22:27:28 +0000 UTC" firstStartedPulling="2026-02-16 22:27:29.780246377 +0000 UTC m=+2970.362747479" lastFinishedPulling="2026-02-16 22:27:36.410378374 +0000 UTC m=+2976.992879486" observedRunningTime="2026-02-16 22:27:37.757010574 +0000 UTC m=+2978.339511676" watchObservedRunningTime="2026-02-16 22:27:37.766879951 +0000 UTC m=+2978.349381053" Feb 16 22:27:41 crc kubenswrapper[4777]: E0216 22:27:41.088589 4777 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 
38.102.83.203:33022->38.102.83.203:33935: read tcp 38.102.83.203:33022->38.102.83.203:33935: read: connection reset by peer Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.651358 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.651420 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.651465 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.652259 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.652317 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" gracePeriod=600 Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.812930 4777 generic.go:334] "Generic 
(PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" exitCode=0 Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.813187 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7"} Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.813223 4777 scope.go:117] "RemoveContainer" containerID="b5345f8b5c96b1c162c761b1ad77770b60a8fb2735ec3a2536797a7f5ed293d9" Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.950804 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8tfd5/crc-debug-6gb8t"] Feb 16 22:27:41 crc kubenswrapper[4777]: E0216 22:27:41.951460 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" containerName="extract-utilities" Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.951529 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" containerName="extract-utilities" Feb 16 22:27:41 crc kubenswrapper[4777]: E0216 22:27:41.951591 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" containerName="extract-content" Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.951647 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" containerName="extract-content" Feb 16 22:27:41 crc kubenswrapper[4777]: E0216 22:27:41.951729 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" containerName="registry-server" Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.951790 4777 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" containerName="registry-server" Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.952041 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae5d7d5-f64c-4a9e-81fb-60664123e692" containerName="registry-server" Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.952773 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" Feb 16 22:27:41 crc kubenswrapper[4777]: I0216 22:27:41.954396 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8tfd5"/"default-dockercfg-fdfxv" Feb 16 22:27:42 crc kubenswrapper[4777]: I0216 22:27:42.063960 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzzmd\" (UniqueName: \"kubernetes.io/projected/b3ea978c-7cf9-47c7-b71e-a7d3368af195-kube-api-access-dzzmd\") pod \"crc-debug-6gb8t\" (UID: \"b3ea978c-7cf9-47c7-b71e-a7d3368af195\") " pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" Feb 16 22:27:42 crc kubenswrapper[4777]: I0216 22:27:42.064025 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3ea978c-7cf9-47c7-b71e-a7d3368af195-host\") pod \"crc-debug-6gb8t\" (UID: \"b3ea978c-7cf9-47c7-b71e-a7d3368af195\") " pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" Feb 16 22:27:42 crc kubenswrapper[4777]: I0216 22:27:42.165860 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzzmd\" (UniqueName: \"kubernetes.io/projected/b3ea978c-7cf9-47c7-b71e-a7d3368af195-kube-api-access-dzzmd\") pod \"crc-debug-6gb8t\" (UID: \"b3ea978c-7cf9-47c7-b71e-a7d3368af195\") " pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" Feb 16 22:27:42 crc kubenswrapper[4777]: I0216 22:27:42.165917 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/b3ea978c-7cf9-47c7-b71e-a7d3368af195-host\") pod \"crc-debug-6gb8t\" (UID: \"b3ea978c-7cf9-47c7-b71e-a7d3368af195\") " pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" Feb 16 22:27:42 crc kubenswrapper[4777]: I0216 22:27:42.166077 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3ea978c-7cf9-47c7-b71e-a7d3368af195-host\") pod \"crc-debug-6gb8t\" (UID: \"b3ea978c-7cf9-47c7-b71e-a7d3368af195\") " pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" Feb 16 22:27:42 crc kubenswrapper[4777]: E0216 22:27:42.175304 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:27:42 crc kubenswrapper[4777]: I0216 22:27:42.188908 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzzmd\" (UniqueName: \"kubernetes.io/projected/b3ea978c-7cf9-47c7-b71e-a7d3368af195-kube-api-access-dzzmd\") pod \"crc-debug-6gb8t\" (UID: \"b3ea978c-7cf9-47c7-b71e-a7d3368af195\") " pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" Feb 16 22:27:42 crc kubenswrapper[4777]: I0216 22:27:42.274175 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" Feb 16 22:27:42 crc kubenswrapper[4777]: I0216 22:27:42.827236 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:27:42 crc kubenswrapper[4777]: E0216 22:27:42.827907 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:27:42 crc kubenswrapper[4777]: I0216 22:27:42.827988 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" event={"ID":"b3ea978c-7cf9-47c7-b71e-a7d3368af195","Type":"ContainerStarted","Data":"83b1e94ba037f7b4d681ae24106af648f2f84fbccdd655aa78f4fe3dec1c892c"} Feb 16 22:27:50 crc kubenswrapper[4777]: E0216 22:27:50.190884 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:27:53 crc kubenswrapper[4777]: I0216 22:27:53.927860 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" event={"ID":"b3ea978c-7cf9-47c7-b71e-a7d3368af195","Type":"ContainerStarted","Data":"e244ca38ff8ec21d0ec5b9ec931bd4f5dd5a51f1a8f8edbc5b6cb9e954fd5cf1"} Feb 16 22:27:53 crc kubenswrapper[4777]: I0216 22:27:53.948855 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" 
podStartSLOduration=1.614333078 podStartE2EDuration="12.948829557s" podCreationTimestamp="2026-02-16 22:27:41 +0000 UTC" firstStartedPulling="2026-02-16 22:27:42.320112952 +0000 UTC m=+2982.902614094" lastFinishedPulling="2026-02-16 22:27:53.654609471 +0000 UTC m=+2994.237110573" observedRunningTime="2026-02-16 22:27:53.940578935 +0000 UTC m=+2994.523080037" watchObservedRunningTime="2026-02-16 22:27:53.948829557 +0000 UTC m=+2994.531331039" Feb 16 22:27:58 crc kubenswrapper[4777]: I0216 22:27:58.181973 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:27:58 crc kubenswrapper[4777]: E0216 22:27:58.182945 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:28:03 crc kubenswrapper[4777]: E0216 22:28:03.187826 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:28:09 crc kubenswrapper[4777]: I0216 22:28:09.077201 4777 generic.go:334] "Generic (PLEG): container finished" podID="b3ea978c-7cf9-47c7-b71e-a7d3368af195" containerID="e244ca38ff8ec21d0ec5b9ec931bd4f5dd5a51f1a8f8edbc5b6cb9e954fd5cf1" exitCode=0 Feb 16 22:28:09 crc kubenswrapper[4777]: I0216 22:28:09.077271 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" 
event={"ID":"b3ea978c-7cf9-47c7-b71e-a7d3368af195","Type":"ContainerDied","Data":"e244ca38ff8ec21d0ec5b9ec931bd4f5dd5a51f1a8f8edbc5b6cb9e954fd5cf1"} Feb 16 22:28:10 crc kubenswrapper[4777]: I0216 22:28:10.237776 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" Feb 16 22:28:10 crc kubenswrapper[4777]: I0216 22:28:10.267682 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8tfd5/crc-debug-6gb8t"] Feb 16 22:28:10 crc kubenswrapper[4777]: I0216 22:28:10.286358 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8tfd5/crc-debug-6gb8t"] Feb 16 22:28:10 crc kubenswrapper[4777]: I0216 22:28:10.375901 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzzmd\" (UniqueName: \"kubernetes.io/projected/b3ea978c-7cf9-47c7-b71e-a7d3368af195-kube-api-access-dzzmd\") pod \"b3ea978c-7cf9-47c7-b71e-a7d3368af195\" (UID: \"b3ea978c-7cf9-47c7-b71e-a7d3368af195\") " Feb 16 22:28:10 crc kubenswrapper[4777]: I0216 22:28:10.376075 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3ea978c-7cf9-47c7-b71e-a7d3368af195-host\") pod \"b3ea978c-7cf9-47c7-b71e-a7d3368af195\" (UID: \"b3ea978c-7cf9-47c7-b71e-a7d3368af195\") " Feb 16 22:28:10 crc kubenswrapper[4777]: I0216 22:28:10.376447 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3ea978c-7cf9-47c7-b71e-a7d3368af195-host" (OuterVolumeSpecName: "host") pod "b3ea978c-7cf9-47c7-b71e-a7d3368af195" (UID: "b3ea978c-7cf9-47c7-b71e-a7d3368af195"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 22:28:10 crc kubenswrapper[4777]: I0216 22:28:10.380115 4777 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3ea978c-7cf9-47c7-b71e-a7d3368af195-host\") on node \"crc\" DevicePath \"\"" Feb 16 22:28:10 crc kubenswrapper[4777]: I0216 22:28:10.387894 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ea978c-7cf9-47c7-b71e-a7d3368af195-kube-api-access-dzzmd" (OuterVolumeSpecName: "kube-api-access-dzzmd") pod "b3ea978c-7cf9-47c7-b71e-a7d3368af195" (UID: "b3ea978c-7cf9-47c7-b71e-a7d3368af195"). InnerVolumeSpecName "kube-api-access-dzzmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:28:10 crc kubenswrapper[4777]: I0216 22:28:10.481857 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzzmd\" (UniqueName: \"kubernetes.io/projected/b3ea978c-7cf9-47c7-b71e-a7d3368af195-kube-api-access-dzzmd\") on node \"crc\" DevicePath \"\"" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.098091 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83b1e94ba037f7b4d681ae24106af648f2f84fbccdd655aa78f4fe3dec1c892c" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.098158 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8tfd5/crc-debug-6gb8t" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.505499 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8tfd5/crc-debug-zj65d"] Feb 16 22:28:11 crc kubenswrapper[4777]: E0216 22:28:11.506217 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ea978c-7cf9-47c7-b71e-a7d3368af195" containerName="container-00" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.506229 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ea978c-7cf9-47c7-b71e-a7d3368af195" containerName="container-00" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.506447 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ea978c-7cf9-47c7-b71e-a7d3368af195" containerName="container-00" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.507154 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tfd5/crc-debug-zj65d" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.509966 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8tfd5"/"default-dockercfg-fdfxv" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.604826 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc4gr\" (UniqueName: \"kubernetes.io/projected/02d25f92-6f74-4c95-a2ca-72cec4bba97c-kube-api-access-rc4gr\") pod \"crc-debug-zj65d\" (UID: \"02d25f92-6f74-4c95-a2ca-72cec4bba97c\") " pod="openshift-must-gather-8tfd5/crc-debug-zj65d" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.604938 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02d25f92-6f74-4c95-a2ca-72cec4bba97c-host\") pod \"crc-debug-zj65d\" (UID: \"02d25f92-6f74-4c95-a2ca-72cec4bba97c\") " 
pod="openshift-must-gather-8tfd5/crc-debug-zj65d" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.706869 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc4gr\" (UniqueName: \"kubernetes.io/projected/02d25f92-6f74-4c95-a2ca-72cec4bba97c-kube-api-access-rc4gr\") pod \"crc-debug-zj65d\" (UID: \"02d25f92-6f74-4c95-a2ca-72cec4bba97c\") " pod="openshift-must-gather-8tfd5/crc-debug-zj65d" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.706941 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02d25f92-6f74-4c95-a2ca-72cec4bba97c-host\") pod \"crc-debug-zj65d\" (UID: \"02d25f92-6f74-4c95-a2ca-72cec4bba97c\") " pod="openshift-must-gather-8tfd5/crc-debug-zj65d" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.707128 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02d25f92-6f74-4c95-a2ca-72cec4bba97c-host\") pod \"crc-debug-zj65d\" (UID: \"02d25f92-6f74-4c95-a2ca-72cec4bba97c\") " pod="openshift-must-gather-8tfd5/crc-debug-zj65d" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.724380 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc4gr\" (UniqueName: \"kubernetes.io/projected/02d25f92-6f74-4c95-a2ca-72cec4bba97c-kube-api-access-rc4gr\") pod \"crc-debug-zj65d\" (UID: \"02d25f92-6f74-4c95-a2ca-72cec4bba97c\") " pod="openshift-must-gather-8tfd5/crc-debug-zj65d" Feb 16 22:28:11 crc kubenswrapper[4777]: I0216 22:28:11.823423 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8tfd5/crc-debug-zj65d" Feb 16 22:28:11 crc kubenswrapper[4777]: W0216 22:28:11.848299 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02d25f92_6f74_4c95_a2ca_72cec4bba97c.slice/crio-cb79eb86f382c412dd94a1c73961546dcf8e181cadb6e2e6a16cf7e1b46232ba WatchSource:0}: Error finding container cb79eb86f382c412dd94a1c73961546dcf8e181cadb6e2e6a16cf7e1b46232ba: Status 404 returned error can't find the container with id cb79eb86f382c412dd94a1c73961546dcf8e181cadb6e2e6a16cf7e1b46232ba Feb 16 22:28:12 crc kubenswrapper[4777]: I0216 22:28:12.116444 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tfd5/crc-debug-zj65d" event={"ID":"02d25f92-6f74-4c95-a2ca-72cec4bba97c","Type":"ContainerStarted","Data":"cb79eb86f382c412dd94a1c73961546dcf8e181cadb6e2e6a16cf7e1b46232ba"} Feb 16 22:28:12 crc kubenswrapper[4777]: I0216 22:28:12.182332 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:28:12 crc kubenswrapper[4777]: E0216 22:28:12.182581 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:28:12 crc kubenswrapper[4777]: I0216 22:28:12.192186 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ea978c-7cf9-47c7-b71e-a7d3368af195" path="/var/lib/kubelet/pods/b3ea978c-7cf9-47c7-b71e-a7d3368af195/volumes" Feb 16 22:28:13 crc kubenswrapper[4777]: I0216 22:28:13.127064 4777 generic.go:334] "Generic (PLEG): container finished" 
podID="02d25f92-6f74-4c95-a2ca-72cec4bba97c" containerID="23c3c98d1429958705ac48123d956d421618fd1fb5a11e1a7658402c0f2a9795" exitCode=1 Feb 16 22:28:13 crc kubenswrapper[4777]: I0216 22:28:13.127178 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tfd5/crc-debug-zj65d" event={"ID":"02d25f92-6f74-4c95-a2ca-72cec4bba97c","Type":"ContainerDied","Data":"23c3c98d1429958705ac48123d956d421618fd1fb5a11e1a7658402c0f2a9795"} Feb 16 22:28:13 crc kubenswrapper[4777]: I0216 22:28:13.159475 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8tfd5/crc-debug-zj65d"] Feb 16 22:28:13 crc kubenswrapper[4777]: I0216 22:28:13.168527 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8tfd5/crc-debug-zj65d"] Feb 16 22:28:14 crc kubenswrapper[4777]: E0216 22:28:14.183006 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:28:14 crc kubenswrapper[4777]: I0216 22:28:14.274314 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8tfd5/crc-debug-zj65d" Feb 16 22:28:14 crc kubenswrapper[4777]: I0216 22:28:14.396075 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc4gr\" (UniqueName: \"kubernetes.io/projected/02d25f92-6f74-4c95-a2ca-72cec4bba97c-kube-api-access-rc4gr\") pod \"02d25f92-6f74-4c95-a2ca-72cec4bba97c\" (UID: \"02d25f92-6f74-4c95-a2ca-72cec4bba97c\") " Feb 16 22:28:14 crc kubenswrapper[4777]: I0216 22:28:14.396190 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02d25f92-6f74-4c95-a2ca-72cec4bba97c-host\") pod \"02d25f92-6f74-4c95-a2ca-72cec4bba97c\" (UID: \"02d25f92-6f74-4c95-a2ca-72cec4bba97c\") " Feb 16 22:28:14 crc kubenswrapper[4777]: I0216 22:28:14.396268 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02d25f92-6f74-4c95-a2ca-72cec4bba97c-host" (OuterVolumeSpecName: "host") pod "02d25f92-6f74-4c95-a2ca-72cec4bba97c" (UID: "02d25f92-6f74-4c95-a2ca-72cec4bba97c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 16 22:28:14 crc kubenswrapper[4777]: I0216 22:28:14.396852 4777 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02d25f92-6f74-4c95-a2ca-72cec4bba97c-host\") on node \"crc\" DevicePath \"\"" Feb 16 22:28:14 crc kubenswrapper[4777]: I0216 22:28:14.409000 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d25f92-6f74-4c95-a2ca-72cec4bba97c-kube-api-access-rc4gr" (OuterVolumeSpecName: "kube-api-access-rc4gr") pod "02d25f92-6f74-4c95-a2ca-72cec4bba97c" (UID: "02d25f92-6f74-4c95-a2ca-72cec4bba97c"). InnerVolumeSpecName "kube-api-access-rc4gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:28:14 crc kubenswrapper[4777]: I0216 22:28:14.498347 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc4gr\" (UniqueName: \"kubernetes.io/projected/02d25f92-6f74-4c95-a2ca-72cec4bba97c-kube-api-access-rc4gr\") on node \"crc\" DevicePath \"\"" Feb 16 22:28:15 crc kubenswrapper[4777]: I0216 22:28:15.149461 4777 scope.go:117] "RemoveContainer" containerID="23c3c98d1429958705ac48123d956d421618fd1fb5a11e1a7658402c0f2a9795" Feb 16 22:28:15 crc kubenswrapper[4777]: I0216 22:28:15.149831 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tfd5/crc-debug-zj65d" Feb 16 22:28:16 crc kubenswrapper[4777]: I0216 22:28:16.202274 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d25f92-6f74-4c95-a2ca-72cec4bba97c" path="/var/lib/kubelet/pods/02d25f92-6f74-4c95-a2ca-72cec4bba97c/volumes" Feb 16 22:28:23 crc kubenswrapper[4777]: I0216 22:28:23.181764 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:28:23 crc kubenswrapper[4777]: E0216 22:28:23.182780 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:28:27 crc kubenswrapper[4777]: E0216 22:28:27.184575 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" 
podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:28:38 crc kubenswrapper[4777]: I0216 22:28:38.182047 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:28:38 crc kubenswrapper[4777]: E0216 22:28:38.183788 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:28:41 crc kubenswrapper[4777]: E0216 22:28:41.184143 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:28:52 crc kubenswrapper[4777]: E0216 22:28:52.183505 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:28:53 crc kubenswrapper[4777]: I0216 22:28:53.181973 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:28:53 crc kubenswrapper[4777]: E0216 22:28:53.182389 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:29:05 crc kubenswrapper[4777]: E0216 22:29:05.186160 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:29:06 crc kubenswrapper[4777]: I0216 22:29:06.186311 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:29:06 crc kubenswrapper[4777]: E0216 22:29:06.186577 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:29:07 crc kubenswrapper[4777]: I0216 22:29:07.020069 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_29233f83-ffa4-4cbf-bc7e-435e4b44cd5e/init-config-reloader/0.log" Feb 16 22:29:07 crc kubenswrapper[4777]: I0216 22:29:07.188817 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_29233f83-ffa4-4cbf-bc7e-435e4b44cd5e/init-config-reloader/0.log" Feb 16 22:29:07 crc kubenswrapper[4777]: I0216 22:29:07.384650 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_29233f83-ffa4-4cbf-bc7e-435e4b44cd5e/alertmanager/0.log" Feb 16 
22:29:07 crc kubenswrapper[4777]: I0216 22:29:07.394109 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_29233f83-ffa4-4cbf-bc7e-435e4b44cd5e/config-reloader/0.log" Feb 16 22:29:07 crc kubenswrapper[4777]: I0216 22:29:07.553329 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5b898f6696-7cphs_fcf8e71c-6c4e-4724-8d06-46aed549c48f/barbican-api-log/0.log" Feb 16 22:29:07 crc kubenswrapper[4777]: I0216 22:29:07.593122 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5b898f6696-7cphs_fcf8e71c-6c4e-4724-8d06-46aed549c48f/barbican-api/0.log" Feb 16 22:29:07 crc kubenswrapper[4777]: I0216 22:29:07.634292 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5494fbd488-n9m64_3341e79f-f4de-47f5-8cc8-595f1b4fb837/barbican-keystone-listener/0.log" Feb 16 22:29:07 crc kubenswrapper[4777]: I0216 22:29:07.756377 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5494fbd488-n9m64_3341e79f-f4de-47f5-8cc8-595f1b4fb837/barbican-keystone-listener-log/0.log" Feb 16 22:29:07 crc kubenswrapper[4777]: I0216 22:29:07.758148 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6464bc7759-rjmxp_dcf235a4-5b60-4771-9654-1a02d4a9144d/barbican-worker/0.log" Feb 16 22:29:07 crc kubenswrapper[4777]: I0216 22:29:07.843829 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6464bc7759-rjmxp_dcf235a4-5b60-4771-9654-1a02d4a9144d/barbican-worker-log/0.log" Feb 16 22:29:08 crc kubenswrapper[4777]: I0216 22:29:08.020952 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fbefa3e5-4388-4b9e-9c93-6580727d021d/ceilometer-central-agent/0.log" Feb 16 22:29:08 crc kubenswrapper[4777]: I0216 22:29:08.068244 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_fbefa3e5-4388-4b9e-9c93-6580727d021d/proxy-httpd/0.log" Feb 16 22:29:08 crc kubenswrapper[4777]: I0216 22:29:08.109298 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fbefa3e5-4388-4b9e-9c93-6580727d021d/ceilometer-notification-agent/0.log" Feb 16 22:29:08 crc kubenswrapper[4777]: I0216 22:29:08.249351 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fbefa3e5-4388-4b9e-9c93-6580727d021d/sg-core/0.log" Feb 16 22:29:08 crc kubenswrapper[4777]: I0216 22:29:08.276210 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3e6e9a17-49c6-4977-af5d-f21441665952/cinder-api/0.log" Feb 16 22:29:08 crc kubenswrapper[4777]: I0216 22:29:08.317735 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3e6e9a17-49c6-4977-af5d-f21441665952/cinder-api-log/0.log" Feb 16 22:29:08 crc kubenswrapper[4777]: I0216 22:29:08.497723 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_42ee3374-a9ec-4419-a498-aee6852b4ade/probe/0.log" Feb 16 22:29:08 crc kubenswrapper[4777]: I0216 22:29:08.502951 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_42ee3374-a9ec-4419-a498-aee6852b4ade/cinder-scheduler/0.log" Feb 16 22:29:08 crc kubenswrapper[4777]: I0216 22:29:08.709266 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_ae4b796b-cd3f-4a97-a43f-9fec28e71ac7/loki-compactor/0.log" Feb 16 22:29:08 crc kubenswrapper[4777]: I0216 22:29:08.833963 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-jzwq7_6bd47bb8-e12f-4cb4-a343-8f732e339484/loki-distributor/0.log" Feb 16 22:29:08 crc kubenswrapper[4777]: I0216 22:29:08.937737 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-8hh82_1d115cc4-a668-4cd2-b4ae-f812341416a6/gateway/0.log" Feb 16 22:29:09 crc kubenswrapper[4777]: I0216 22:29:09.034402 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-vsxfm_a24ca457-99d2-4651-9220-cfca7e58df3e/gateway/0.log" Feb 16 22:29:09 crc kubenswrapper[4777]: I0216 22:29:09.192851 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_c2c4ae45-a8be-4fce-a939-53cf29c33a77/loki-index-gateway/0.log" Feb 16 22:29:09 crc kubenswrapper[4777]: I0216 22:29:09.262491 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_a882b0c3-7f2e-446b-aea4-476cacffb112/loki-ingester/0.log" Feb 16 22:29:09 crc kubenswrapper[4777]: I0216 22:29:09.450650 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-7jdcb_dd2cb1a1-9d03-4791-bf4d-c93f79816e59/loki-querier/0.log" Feb 16 22:29:09 crc kubenswrapper[4777]: I0216 22:29:09.455263 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-csxs8_be9c8434-a8d2-4404-ad47-b9b91b21f439/loki-query-frontend/0.log" Feb 16 22:29:09 crc kubenswrapper[4777]: I0216 22:29:09.651085 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-qsg2h_368d5445-0b3c-4ea7-b7d8-674eaca061df/init/0.log" Feb 16 22:29:09 crc kubenswrapper[4777]: I0216 22:29:09.858474 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-qsg2h_368d5445-0b3c-4ea7-b7d8-674eaca061df/init/0.log" Feb 16 22:29:09 crc kubenswrapper[4777]: I0216 22:29:09.898045 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-qsg2h_368d5445-0b3c-4ea7-b7d8-674eaca061df/dnsmasq-dns/0.log" Feb 16 22:29:09 crc 
kubenswrapper[4777]: I0216 22:29:09.945149 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3dc126ff-fbc5-4c23-8a19-084e71677f29/glance-httpd/0.log" Feb 16 22:29:10 crc kubenswrapper[4777]: I0216 22:29:10.052955 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3dc126ff-fbc5-4c23-8a19-084e71677f29/glance-log/0.log" Feb 16 22:29:10 crc kubenswrapper[4777]: I0216 22:29:10.142841 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_206cc213-04d8-40c6-befb-914351a6c1fe/glance-httpd/0.log" Feb 16 22:29:10 crc kubenswrapper[4777]: I0216 22:29:10.161636 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_206cc213-04d8-40c6-befb-914351a6c1fe/glance-log/0.log" Feb 16 22:29:10 crc kubenswrapper[4777]: I0216 22:29:10.552731 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-57454d8655-v6bgf_142a3220-766a-49cc-bf87-9c879fc00222/keystone-api/0.log" Feb 16 22:29:10 crc kubenswrapper[4777]: I0216 22:29:10.554656 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29521321-ljtlr_f98b7fd0-6b82-4bba-ac85-6ee9afa204aa/keystone-cron/0.log" Feb 16 22:29:10 crc kubenswrapper[4777]: I0216 22:29:10.751927 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f49eb4b3-3728-4b56-ab7d-00460813ed7c/kube-state-metrics/0.log" Feb 16 22:29:10 crc kubenswrapper[4777]: I0216 22:29:10.984694 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5dd8c5d5-n2w9s_2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8/neutron-api/0.log" Feb 16 22:29:10 crc kubenswrapper[4777]: I0216 22:29:10.992781 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5dd8c5d5-n2w9s_2e1b22d3-1ea0-4f32-92be-3ab6b56a90b8/neutron-httpd/0.log" Feb 16 22:29:11 crc 
kubenswrapper[4777]: I0216 22:29:11.322987 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b23d431a-b9f1-4151-9189-f027693eaabf/nova-cell0-conductor-conductor/0.log" Feb 16 22:29:11 crc kubenswrapper[4777]: I0216 22:29:11.369467 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bbc0152a-9610-4701-b7f9-e2ae9ddcf53a/nova-api-log/0.log" Feb 16 22:29:11 crc kubenswrapper[4777]: I0216 22:29:11.388113 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bbc0152a-9610-4701-b7f9-e2ae9ddcf53a/nova-api-api/0.log" Feb 16 22:29:11 crc kubenswrapper[4777]: I0216 22:29:11.661680 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_bdf361b5-ee5d-4273-a1b8-4efe5941b4a8/nova-cell1-conductor-conductor/0.log" Feb 16 22:29:11 crc kubenswrapper[4777]: I0216 22:29:11.712327 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ab854724-b71f-40d0-ac17-43ddc095c3ac/nova-cell1-novncproxy-novncproxy/0.log" Feb 16 22:29:11 crc kubenswrapper[4777]: I0216 22:29:11.921188 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_376df6ac-d55a-46a9-9b11-893215a316a7/nova-metadata-log/0.log" Feb 16 22:29:12 crc kubenswrapper[4777]: I0216 22:29:12.181231 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_265517fa-85cd-4110-b910-69090b69be97/nova-scheduler-scheduler/0.log" Feb 16 22:29:12 crc kubenswrapper[4777]: I0216 22:29:12.275543 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a/mysql-bootstrap/0.log" Feb 16 22:29:12 crc kubenswrapper[4777]: I0216 22:29:12.470930 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a/mysql-bootstrap/0.log" Feb 16 22:29:12 crc 
kubenswrapper[4777]: I0216 22:29:12.498225 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f52bd37b-edb4-4fcb-aee8-1e1c828b9c5a/galera/0.log" Feb 16 22:29:12 crc kubenswrapper[4777]: I0216 22:29:12.672610 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_aadef7bb-2bff-4cd3-9a56-6b42ca417c94/mysql-bootstrap/0.log" Feb 16 22:29:12 crc kubenswrapper[4777]: I0216 22:29:12.696320 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_376df6ac-d55a-46a9-9b11-893215a316a7/nova-metadata-metadata/0.log" Feb 16 22:29:12 crc kubenswrapper[4777]: I0216 22:29:12.938835 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_aadef7bb-2bff-4cd3-9a56-6b42ca417c94/mysql-bootstrap/0.log" Feb 16 22:29:12 crc kubenswrapper[4777]: I0216 22:29:12.957210 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_aadef7bb-2bff-4cd3-9a56-6b42ca417c94/galera/0.log" Feb 16 22:29:12 crc kubenswrapper[4777]: I0216 22:29:12.976336 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5ae5d18f-9104-4d22-ab01-f97681e4bbc8/openstackclient/0.log" Feb 16 22:29:13 crc kubenswrapper[4777]: I0216 22:29:13.452183 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9zcrw_13df94da-db3f-470b-ab61-f8ad3a4dd75d/openstack-network-exporter/0.log" Feb 16 22:29:13 crc kubenswrapper[4777]: I0216 22:29:13.495684 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wbfs_129ef04f-890a-41e2-936b-833a227993e5/ovsdb-server-init/0.log" Feb 16 22:29:13 crc kubenswrapper[4777]: I0216 22:29:13.709914 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wbfs_129ef04f-890a-41e2-936b-833a227993e5/ovs-vswitchd/0.log" Feb 16 22:29:13 crc kubenswrapper[4777]: I0216 
22:29:13.836276 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wbfs_129ef04f-890a-41e2-936b-833a227993e5/ovsdb-server-init/0.log" Feb 16 22:29:13 crc kubenswrapper[4777]: I0216 22:29:13.863628 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7wbfs_129ef04f-890a-41e2-936b-833a227993e5/ovsdb-server/0.log" Feb 16 22:29:13 crc kubenswrapper[4777]: I0216 22:29:13.900472 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-q5qdr_7f871748-de9b-43dc-87df-728b12402b6b/ovn-controller/0.log" Feb 16 22:29:14 crc kubenswrapper[4777]: I0216 22:29:14.049913 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_087976e9-29d9-4bc1-9939-e73ef0be9e0e/openstack-network-exporter/0.log" Feb 16 22:29:14 crc kubenswrapper[4777]: I0216 22:29:14.140903 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_087976e9-29d9-4bc1-9939-e73ef0be9e0e/ovn-northd/0.log" Feb 16 22:29:14 crc kubenswrapper[4777]: I0216 22:29:14.242589 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7/openstack-network-exporter/0.log" Feb 16 22:29:14 crc kubenswrapper[4777]: I0216 22:29:14.291272 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_62bafcaf-aae9-46cf-bff3-3cf0f78ce7a7/ovsdbserver-nb/0.log" Feb 16 22:29:14 crc kubenswrapper[4777]: I0216 22:29:14.476295 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_79b13da6-0857-40ec-9c96-5a5a28c6dd69/openstack-network-exporter/0.log" Feb 16 22:29:14 crc kubenswrapper[4777]: I0216 22:29:14.499735 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_79b13da6-0857-40ec-9c96-5a5a28c6dd69/ovsdbserver-sb/0.log" Feb 16 22:29:14 crc kubenswrapper[4777]: I0216 22:29:14.661790 4777 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-57449bfc86-nk7r7_73dc4ce5-0c6e-492a-84b5-097a3defc481/placement-api/0.log" Feb 16 22:29:14 crc kubenswrapper[4777]: I0216 22:29:14.752629 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57449bfc86-nk7r7_73dc4ce5-0c6e-492a-84b5-097a3defc481/placement-log/0.log" Feb 16 22:29:14 crc kubenswrapper[4777]: I0216 22:29:14.861265 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_03c6545a-838a-444c-8833-871730be59a7/init-config-reloader/0.log" Feb 16 22:29:15 crc kubenswrapper[4777]: I0216 22:29:15.063845 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_03c6545a-838a-444c-8833-871730be59a7/config-reloader/0.log" Feb 16 22:29:15 crc kubenswrapper[4777]: I0216 22:29:15.079414 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_03c6545a-838a-444c-8833-871730be59a7/init-config-reloader/0.log" Feb 16 22:29:15 crc kubenswrapper[4777]: I0216 22:29:15.109526 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_03c6545a-838a-444c-8833-871730be59a7/thanos-sidecar/0.log" Feb 16 22:29:15 crc kubenswrapper[4777]: I0216 22:29:15.124307 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_03c6545a-838a-444c-8833-871730be59a7/prometheus/0.log" Feb 16 22:29:15 crc kubenswrapper[4777]: I0216 22:29:15.308003 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9346e4ab-e1d7-42a9-8817-850a2f84e57d/setup-container/0.log" Feb 16 22:29:15 crc kubenswrapper[4777]: I0216 22:29:15.548879 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4df600f3-97e1-4ac5-980b-2c42aecc5e81/setup-container/0.log" Feb 16 22:29:15 crc kubenswrapper[4777]: I0216 22:29:15.551531 4777 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9346e4ab-e1d7-42a9-8817-850a2f84e57d/setup-container/0.log" Feb 16 22:29:15 crc kubenswrapper[4777]: I0216 22:29:15.596142 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9346e4ab-e1d7-42a9-8817-850a2f84e57d/rabbitmq/0.log" Feb 16 22:29:15 crc kubenswrapper[4777]: I0216 22:29:15.802837 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4df600f3-97e1-4ac5-980b-2c42aecc5e81/setup-container/0.log" Feb 16 22:29:15 crc kubenswrapper[4777]: I0216 22:29:15.823509 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4df600f3-97e1-4ac5-980b-2c42aecc5e81/rabbitmq/0.log" Feb 16 22:29:15 crc kubenswrapper[4777]: I0216 22:29:15.877972 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-69c8b968f9-qldk7_9159fa9f-578f-4292-bf32-8acbecf95c58/proxy-httpd/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.028788 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-69c8b968f9-qldk7_9159fa9f-578f-4292-bf32-8acbecf95c58/proxy-server/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.324619 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-rjx9d_f855cb15-8085-43b1-825e-e6316c580924/swift-ring-rebalance/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.376523 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/account-auditor/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.406565 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/account-reaper/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.534891 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/account-replicator/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.560982 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/container-auditor/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.571580 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/account-server/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.690528 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/container-replicator/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.748304 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/container-server/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.766440 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/container-updater/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.793052 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/object-auditor/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.954267 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/object-replicator/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.967598 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/object-expirer/0.log" Feb 16 22:29:16 crc kubenswrapper[4777]: I0216 22:29:16.977437 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/object-server/0.log" Feb 16 22:29:17 crc kubenswrapper[4777]: I0216 22:29:17.071755 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/object-updater/0.log" Feb 16 22:29:17 crc kubenswrapper[4777]: I0216 22:29:17.133073 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/rsync/0.log" Feb 16 22:29:17 crc kubenswrapper[4777]: I0216 22:29:17.201217 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3392d073-5de5-4f7e-ae87-e892f769157a/swift-recon-cron/0.log" Feb 16 22:29:18 crc kubenswrapper[4777]: I0216 22:29:18.480694 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_fa6dd43f-e212-439a-b078-4a0a5c1db760/memcached/0.log" Feb 16 22:29:19 crc kubenswrapper[4777]: E0216 22:29:19.183672 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:29:20 crc kubenswrapper[4777]: I0216 22:29:20.188023 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:29:20 crc kubenswrapper[4777]: E0216 22:29:20.188318 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" 
podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:29:30 crc kubenswrapper[4777]: E0216 22:29:30.195213 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:29:34 crc kubenswrapper[4777]: I0216 22:29:34.182002 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:29:34 crc kubenswrapper[4777]: E0216 22:29:34.183262 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:29:41 crc kubenswrapper[4777]: E0216 22:29:41.307379 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:29:41 crc kubenswrapper[4777]: E0216 22:29:41.307822 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 16 22:29:41 crc kubenswrapper[4777]: E0216 22:29:41.307945 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/
var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError" Feb 16 22:29:41 crc kubenswrapper[4777]: E0216 22:29:41.309120 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. 
To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:29:42 crc kubenswrapper[4777]: I0216 22:29:42.090948 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5_27319f1d-e680-40c0-a521-c2f388e8af6b/util/0.log" Feb 16 22:29:42 crc kubenswrapper[4777]: I0216 22:29:42.249439 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5_27319f1d-e680-40c0-a521-c2f388e8af6b/util/0.log" Feb 16 22:29:42 crc kubenswrapper[4777]: I0216 22:29:42.263356 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5_27319f1d-e680-40c0-a521-c2f388e8af6b/pull/0.log" Feb 16 22:29:42 crc kubenswrapper[4777]: I0216 22:29:42.298334 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5_27319f1d-e680-40c0-a521-c2f388e8af6b/pull/0.log" Feb 16 22:29:42 crc kubenswrapper[4777]: I0216 22:29:42.478337 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5_27319f1d-e680-40c0-a521-c2f388e8af6b/pull/0.log" Feb 16 22:29:42 crc kubenswrapper[4777]: I0216 22:29:42.487589 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5_27319f1d-e680-40c0-a521-c2f388e8af6b/util/0.log" Feb 16 22:29:42 crc kubenswrapper[4777]: I0216 22:29:42.493808 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02aaacdbb2cdc34212ef0d4f992a08d2443727e2a4312d7c57a1078608jrwb5_27319f1d-e680-40c0-a521-c2f388e8af6b/extract/0.log" Feb 16 22:29:42 crc kubenswrapper[4777]: I0216 
22:29:42.909079 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-ssfj6_f4b16894-e8b0-4476-94ae-c6112200fa5d/manager/0.log" Feb 16 22:29:43 crc kubenswrapper[4777]: I0216 22:29:43.729423 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-6km9d_7f900c19-63a0-40b9-a8c4-c51883a5087e/manager/0.log" Feb 16 22:29:43 crc kubenswrapper[4777]: I0216 22:29:43.898331 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-ss4k5_522da514-4a99-4293-8fe8-45285b4f24eb/manager/0.log" Feb 16 22:29:44 crc kubenswrapper[4777]: I0216 22:29:44.173281 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-v2m7v_3269da89-1f17-44af-8ea9-83eb493249a2/manager/0.log" Feb 16 22:29:44 crc kubenswrapper[4777]: I0216 22:29:44.321683 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-rxd87_0994c976-c6c3-42e0-9b54-e09d2fac5447/manager/0.log" Feb 16 22:29:44 crc kubenswrapper[4777]: I0216 22:29:44.507161 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-9lpfx_c6f0dee9-40e3-42d5-8651-b340e075891c/manager/0.log" Feb 16 22:29:44 crc kubenswrapper[4777]: I0216 22:29:44.652450 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-7shhv_1d6bef33-b14f-435b-aa17-dfbed9c15d86/manager/0.log" Feb 16 22:29:44 crc kubenswrapper[4777]: I0216 22:29:44.829090 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-slq9f_583dde29-d4a1-4ec2-9021-3735d6c5717b/manager/0.log" Feb 16 22:29:45 crc 
kubenswrapper[4777]: I0216 22:29:45.181507 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:29:45 crc kubenswrapper[4777]: I0216 22:29:45.182000 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-v2dbc_49e70c01-9a6c-488a-a766-a430583352fb/manager/0.log" Feb 16 22:29:45 crc kubenswrapper[4777]: E0216 22:29:45.182024 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:29:45 crc kubenswrapper[4777]: I0216 22:29:45.242261 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-znh8t_d255b1d6-45e9-4749-8252-b77b97cd940a/manager/0.log" Feb 16 22:29:45 crc kubenswrapper[4777]: I0216 22:29:45.560199 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-nwzcm_35deb9f9-3cf5-4f92-9ba0-5f8eac14733a/manager/0.log" Feb 16 22:29:45 crc kubenswrapper[4777]: I0216 22:29:45.773267 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-mg48d_c69b97e2-c7ed-4b0a-9e29-fcd457d7a453/manager/0.log" Feb 16 22:29:45 crc kubenswrapper[4777]: I0216 22:29:45.969493 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9ct28cs_3a482208-548a-47dc-aa40-c23cbcdf5363/manager/0.log" Feb 16 22:29:46 crc kubenswrapper[4777]: I0216 22:29:46.485663 4777 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7845fcf9cf-qf9zn_df21ebae-039c-40c9-9b98-2ecb22a01bdb/operator/0.log" Feb 16 22:29:46 crc kubenswrapper[4777]: I0216 22:29:46.677073 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ktl5l_5250d17d-4d3e-4b18-a738-04ec8120dfbb/registry-server/0.log" Feb 16 22:29:47 crc kubenswrapper[4777]: I0216 22:29:47.017512 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-k4297_b455e5b7-ca2c-4f12-91de-da9969199670/manager/0.log" Feb 16 22:29:47 crc kubenswrapper[4777]: I0216 22:29:47.242956 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-p9lqg_c853fc19-52e2-407b-ac21-638cdf255085/manager/0.log" Feb 16 22:29:47 crc kubenswrapper[4777]: I0216 22:29:47.410490 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-x2pd2_9fbfe46d-cddb-41eb-920a-0859f6fcbb27/manager/0.log" Feb 16 22:29:47 crc kubenswrapper[4777]: I0216 22:29:47.436074 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-w4ld6_7afb9e38-ac2c-483f-a98e-716bfa22ee6d/operator/0.log" Feb 16 22:29:47 crc kubenswrapper[4777]: I0216 22:29:47.574266 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-9c8f544df-6dkb7_c1a0562b-8960-485e-9c6a-da5a73a18180/manager/0.log" Feb 16 22:29:47 crc kubenswrapper[4777]: I0216 22:29:47.652999 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-g8lbd_1b0e41ce-7bfe-4c18-a82f-eecae3df4d5d/manager/0.log" Feb 16 22:29:47 crc kubenswrapper[4777]: I0216 22:29:47.903709 4777 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-mzrs2_20881f3c-e228-4c63-9eb9-dbedd55ffc14/manager/0.log" Feb 16 22:29:48 crc kubenswrapper[4777]: I0216 22:29:48.077092 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-b7tpr_a565ae46-9b4e-4310-8867-230d4a65eea5/manager/0.log" Feb 16 22:29:48 crc kubenswrapper[4777]: I0216 22:29:48.231787 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-79996fd568-n7rmk_1be584eb-83c6-4709-8f10-60474c6f3f93/manager/0.log" Feb 16 22:29:49 crc kubenswrapper[4777]: I0216 22:29:49.624090 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-k22zl_367fde39-a8cd-443a-83f8-f9cb0e9f3576/manager/0.log" Feb 16 22:29:55 crc kubenswrapper[4777]: E0216 22:29:55.183514 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.154425 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg"] Feb 16 22:30:00 crc kubenswrapper[4777]: E0216 22:30:00.155227 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d25f92-6f74-4c95-a2ca-72cec4bba97c" containerName="container-00" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.155238 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d25f92-6f74-4c95-a2ca-72cec4bba97c" containerName="container-00" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.155490 4777 
memory_manager.go:354] "RemoveStaleState removing state" podUID="02d25f92-6f74-4c95-a2ca-72cec4bba97c" containerName="container-00" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.156274 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.159155 4777 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.162506 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg"] Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.180915 4777 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.188293 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:30:00 crc kubenswrapper[4777]: E0216 22:30:00.188814 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.289003 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w88h7\" (UniqueName: \"kubernetes.io/projected/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-kube-api-access-w88h7\") pod \"collect-profiles-29521350-fsrqg\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.289323 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-secret-volume\") pod \"collect-profiles-29521350-fsrqg\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.289878 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-config-volume\") pod \"collect-profiles-29521350-fsrqg\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.391512 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-secret-volume\") pod \"collect-profiles-29521350-fsrqg\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.391734 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-config-volume\") pod \"collect-profiles-29521350-fsrqg\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.391819 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w88h7\" (UniqueName: 
\"kubernetes.io/projected/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-kube-api-access-w88h7\") pod \"collect-profiles-29521350-fsrqg\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.392648 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-config-volume\") pod \"collect-profiles-29521350-fsrqg\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.404359 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-secret-volume\") pod \"collect-profiles-29521350-fsrqg\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.410464 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w88h7\" (UniqueName: \"kubernetes.io/projected/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-kube-api-access-w88h7\") pod \"collect-profiles-29521350-fsrqg\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.487055 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:00 crc kubenswrapper[4777]: I0216 22:30:00.946582 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg"] Feb 16 22:30:00 crc kubenswrapper[4777]: W0216 22:30:00.949840 4777 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3058a9c3_0db9_4854_9c91_ba8b79a1c46b.slice/crio-aeec04681e66191ac147c4a8377c21cccfe13293bc612dc77c59d601ce06c0b1 WatchSource:0}: Error finding container aeec04681e66191ac147c4a8377c21cccfe13293bc612dc77c59d601ce06c0b1: Status 404 returned error can't find the container with id aeec04681e66191ac147c4a8377c21cccfe13293bc612dc77c59d601ce06c0b1 Feb 16 22:30:01 crc kubenswrapper[4777]: I0216 22:30:01.572049 4777 generic.go:334] "Generic (PLEG): container finished" podID="3058a9c3-0db9-4854-9c91-ba8b79a1c46b" containerID="633ff63286da38567c0723d13e9f648d65bdc1dd84fa2fe67502dfd9ca3e506e" exitCode=0 Feb 16 22:30:01 crc kubenswrapper[4777]: I0216 22:30:01.572127 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" event={"ID":"3058a9c3-0db9-4854-9c91-ba8b79a1c46b","Type":"ContainerDied","Data":"633ff63286da38567c0723d13e9f648d65bdc1dd84fa2fe67502dfd9ca3e506e"} Feb 16 22:30:01 crc kubenswrapper[4777]: I0216 22:30:01.572339 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" event={"ID":"3058a9c3-0db9-4854-9c91-ba8b79a1c46b","Type":"ContainerStarted","Data":"aeec04681e66191ac147c4a8377c21cccfe13293bc612dc77c59d601ce06c0b1"} Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.044108 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.161158 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-secret-volume\") pod \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.161461 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w88h7\" (UniqueName: \"kubernetes.io/projected/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-kube-api-access-w88h7\") pod \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.161584 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-config-volume\") pod \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\" (UID: \"3058a9c3-0db9-4854-9c91-ba8b79a1c46b\") " Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.162462 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-config-volume" (OuterVolumeSpecName: "config-volume") pod "3058a9c3-0db9-4854-9c91-ba8b79a1c46b" (UID: "3058a9c3-0db9-4854-9c91-ba8b79a1c46b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.167098 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-kube-api-access-w88h7" (OuterVolumeSpecName: "kube-api-access-w88h7") pod "3058a9c3-0db9-4854-9c91-ba8b79a1c46b" (UID: "3058a9c3-0db9-4854-9c91-ba8b79a1c46b"). 
InnerVolumeSpecName "kube-api-access-w88h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.167219 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3058a9c3-0db9-4854-9c91-ba8b79a1c46b" (UID: "3058a9c3-0db9-4854-9c91-ba8b79a1c46b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.264012 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w88h7\" (UniqueName: \"kubernetes.io/projected/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-kube-api-access-w88h7\") on node \"crc\" DevicePath \"\"" Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.264044 4777 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.264053 4777 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3058a9c3-0db9-4854-9c91-ba8b79a1c46b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.596295 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" event={"ID":"3058a9c3-0db9-4854-9c91-ba8b79a1c46b","Type":"ContainerDied","Data":"aeec04681e66191ac147c4a8377c21cccfe13293bc612dc77c59d601ce06c0b1"} Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.596337 4777 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aeec04681e66191ac147c4a8377c21cccfe13293bc612dc77c59d601ce06c0b1" Feb 16 22:30:03 crc kubenswrapper[4777]: I0216 22:30:03.596343 4777 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521350-fsrqg" Feb 16 22:30:04 crc kubenswrapper[4777]: I0216 22:30:04.112049 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"] Feb 16 22:30:04 crc kubenswrapper[4777]: I0216 22:30:04.119103 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521305-cwvkt"] Feb 16 22:30:04 crc kubenswrapper[4777]: I0216 22:30:04.197495 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c81e09cd-794a-4752-afa0-f151859cfdd6" path="/var/lib/kubelet/pods/c81e09cd-794a-4752-afa0-f151859cfdd6/volumes" Feb 16 22:30:05 crc kubenswrapper[4777]: I0216 22:30:05.490824 4777 scope.go:117] "RemoveContainer" containerID="b3ec78363415f9ff565ab426256289768308a677b858101872808bbd73b8c28a" Feb 16 22:30:09 crc kubenswrapper[4777]: I0216 22:30:09.148433 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7fgrb_21b9331b-c0a6-4a19-a056-93eb683693df/control-plane-machine-set-operator/0.log" Feb 16 22:30:09 crc kubenswrapper[4777]: E0216 22:30:09.183843 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:30:09 crc kubenswrapper[4777]: I0216 22:30:09.355772 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r5cl6_5615afd6-9159-4ccc-b08d-4305b9b792bb/machine-api-operator/0.log" Feb 16 22:30:09 crc kubenswrapper[4777]: I0216 22:30:09.370225 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r5cl6_5615afd6-9159-4ccc-b08d-4305b9b792bb/kube-rbac-proxy/0.log" Feb 16 22:30:11 crc kubenswrapper[4777]: I0216 22:30:11.181364 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:30:11 crc kubenswrapper[4777]: E0216 22:30:11.181746 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:30:23 crc kubenswrapper[4777]: E0216 22:30:23.183539 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:30:24 crc kubenswrapper[4777]: I0216 22:30:24.861048 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-kgs5x_658c82ce-4722-4465-a72b-f9d8234c286d/cert-manager-controller/0.log" Feb 16 22:30:25 crc kubenswrapper[4777]: I0216 22:30:25.084395 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-lsvsn_5f765dd2-4db9-46b8-8914-62e18e339d59/cert-manager-cainjector/0.log" Feb 16 22:30:25 crc kubenswrapper[4777]: I0216 22:30:25.139066 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-6q4m2_4591e319-88ea-472b-8adc-1aa253262a37/cert-manager-webhook/0.log" Feb 16 22:30:26 crc kubenswrapper[4777]: I0216 
22:30:26.181789 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:30:26 crc kubenswrapper[4777]: E0216 22:30:26.182034 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:30:37 crc kubenswrapper[4777]: I0216 22:30:37.182356 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:30:37 crc kubenswrapper[4777]: E0216 22:30:37.183041 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:30:37 crc kubenswrapper[4777]: E0216 22:30:37.185285 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:30:38 crc kubenswrapper[4777]: I0216 22:30:38.910523 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-pvxhd_25b13f2e-0ccf-4c04-96ae-96da39ecf652/nmstate-console-plugin/0.log" Feb 16 22:30:39 crc kubenswrapper[4777]: I0216 
22:30:39.065200 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-d969w_1572cd79-8c35-4985-ad71-685580878c04/kube-rbac-proxy/0.log" Feb 16 22:30:39 crc kubenswrapper[4777]: I0216 22:30:39.117885 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-klpn2_5b920bcb-4d07-48db-b7bd-78ce2481fa08/nmstate-handler/0.log" Feb 16 22:30:39 crc kubenswrapper[4777]: I0216 22:30:39.194627 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-d969w_1572cd79-8c35-4985-ad71-685580878c04/nmstate-metrics/0.log" Feb 16 22:30:39 crc kubenswrapper[4777]: I0216 22:30:39.253698 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-2mzfn_cac8ca17-cdb3-4745-a133-f197869914b4/nmstate-operator/0.log" Feb 16 22:30:39 crc kubenswrapper[4777]: I0216 22:30:39.350496 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-9b2zr_1c793dcc-4939-49c3-b6e3-6866f23ae0ae/nmstate-webhook/0.log" Feb 16 22:30:50 crc kubenswrapper[4777]: I0216 22:30:50.188319 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:30:50 crc kubenswrapper[4777]: E0216 22:30:50.189025 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:30:51 crc kubenswrapper[4777]: E0216 22:30:51.184958 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:30:53 crc kubenswrapper[4777]: I0216 22:30:53.284628 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5b545f4d58-l9tqd_94269fb5-9977-4c2c-aa87-1fe5a841dc8d/manager/0.log" Feb 16 22:30:53 crc kubenswrapper[4777]: I0216 22:30:53.315757 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5b545f4d58-l9tqd_94269fb5-9977-4c2c-aa87-1fe5a841dc8d/kube-rbac-proxy/0.log" Feb 16 22:31:02 crc kubenswrapper[4777]: I0216 22:31:02.185583 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:31:02 crc kubenswrapper[4777]: E0216 22:31:02.186542 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:31:02 crc kubenswrapper[4777]: E0216 22:31:02.186913 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:31:07 crc kubenswrapper[4777]: I0216 22:31:07.211827 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-7nx5h_48b27685-17d6-45ee-a71a-965bac61e90c/prometheus-operator/0.log" Feb 16 22:31:07 crc kubenswrapper[4777]: I0216 22:31:07.333992 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc_5f5e98cb-6704-49a9-93b5-4158dbddbb58/prometheus-operator-admission-webhook/0.log" Feb 16 22:31:07 crc kubenswrapper[4777]: I0216 22:31:07.394067 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b_c2deed2d-e90d-4062-817f-524d5413831d/prometheus-operator-admission-webhook/0.log" Feb 16 22:31:07 crc kubenswrapper[4777]: I0216 22:31:07.520647 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-4ljm7_995fa122-9f83-4b7c-97ce-4e1cfeb76b29/operator/0.log" Feb 16 22:31:07 crc kubenswrapper[4777]: I0216 22:31:07.586318 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-zktjn_0997c6ee-fff0-48ed-8234-72a4dde5f326/perses-operator/0.log" Feb 16 22:31:13 crc kubenswrapper[4777]: I0216 22:31:13.181751 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:31:13 crc kubenswrapper[4777]: E0216 22:31:13.182762 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:31:15 crc kubenswrapper[4777]: E0216 22:31:15.185760 4777 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:31:22 crc kubenswrapper[4777]: I0216 22:31:22.296376 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-q4xqv_3c25c0bd-85a5-4a21-9881-81663f96e5c8/kube-rbac-proxy/0.log" Feb 16 22:31:22 crc kubenswrapper[4777]: I0216 22:31:22.397749 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-q4xqv_3c25c0bd-85a5-4a21-9881-81663f96e5c8/controller/0.log" Feb 16 22:31:22 crc kubenswrapper[4777]: I0216 22:31:22.489987 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-frr-files/0.log" Feb 16 22:31:22 crc kubenswrapper[4777]: I0216 22:31:22.654432 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-frr-files/0.log" Feb 16 22:31:22 crc kubenswrapper[4777]: I0216 22:31:22.656376 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-metrics/0.log" Feb 16 22:31:22 crc kubenswrapper[4777]: I0216 22:31:22.681464 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-reloader/0.log" Feb 16 22:31:22 crc kubenswrapper[4777]: I0216 22:31:22.702784 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-reloader/0.log" Feb 16 22:31:22 crc kubenswrapper[4777]: I0216 22:31:22.824440 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-reloader/0.log" Feb 16 22:31:22 crc kubenswrapper[4777]: I0216 22:31:22.836963 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-frr-files/0.log" Feb 16 22:31:22 crc kubenswrapper[4777]: I0216 22:31:22.890230 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-metrics/0.log" Feb 16 22:31:22 crc kubenswrapper[4777]: I0216 22:31:22.894995 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-metrics/0.log" Feb 16 22:31:23 crc kubenswrapper[4777]: I0216 22:31:23.068167 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-metrics/0.log" Feb 16 22:31:23 crc kubenswrapper[4777]: I0216 22:31:23.074990 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-frr-files/0.log" Feb 16 22:31:23 crc kubenswrapper[4777]: I0216 22:31:23.110194 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/cp-reloader/0.log" Feb 16 22:31:23 crc kubenswrapper[4777]: I0216 22:31:23.141183 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/controller/0.log" Feb 16 22:31:23 crc kubenswrapper[4777]: I0216 22:31:23.242076 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/frr-metrics/0.log" Feb 16 22:31:23 crc kubenswrapper[4777]: I0216 22:31:23.296393 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/kube-rbac-proxy/0.log" Feb 16 22:31:23 crc kubenswrapper[4777]: I0216 22:31:23.364321 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/kube-rbac-proxy-frr/0.log" Feb 16 22:31:23 crc kubenswrapper[4777]: I0216 22:31:23.462104 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/reloader/0.log" Feb 16 22:31:23 crc kubenswrapper[4777]: I0216 22:31:23.592647 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-wb5ft_3a4b98ef-eeb8-4840-85e3-4b9ac5040e27/frr-k8s-webhook-server/0.log" Feb 16 22:31:23 crc kubenswrapper[4777]: I0216 22:31:23.819597 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b84c748cb-vs9wb_64191e35-ca2c-4f43-8590-792843cfd6b7/manager/0.log" Feb 16 22:31:23 crc kubenswrapper[4777]: I0216 22:31:23.921322 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6895df4d58-8w2wq_79b8e02d-175c-41be-8a12-6fb9c2da4107/webhook-server/0.log" Feb 16 22:31:24 crc kubenswrapper[4777]: I0216 22:31:24.026584 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q8kgx_3e67f384-5475-4237-bc1f-b7781bd0c8eb/kube-rbac-proxy/0.log" Feb 16 22:31:24 crc kubenswrapper[4777]: I0216 22:31:24.181897 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:31:24 crc kubenswrapper[4777]: E0216 22:31:24.182174 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:31:24 crc kubenswrapper[4777]: I0216 22:31:24.240517 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pq5bx_203c943d-2dd7-4534-a22d-9954b052748f/frr/0.log" Feb 16 22:31:24 crc kubenswrapper[4777]: I0216 22:31:24.444057 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q8kgx_3e67f384-5475-4237-bc1f-b7781bd0c8eb/speaker/0.log" Feb 16 22:31:26 crc kubenswrapper[4777]: E0216 22:31:26.188235 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:31:38 crc kubenswrapper[4777]: I0216 22:31:38.181997 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:31:38 crc kubenswrapper[4777]: E0216 22:31:38.183089 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:31:38 crc kubenswrapper[4777]: I0216 22:31:38.837092 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78_007313eb-feab-4b33-a5f7-d0f6e842d722/util/0.log" Feb 16 22:31:39 crc 
kubenswrapper[4777]: I0216 22:31:39.007977 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78_007313eb-feab-4b33-a5f7-d0f6e842d722/pull/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.024232 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78_007313eb-feab-4b33-a5f7-d0f6e842d722/util/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.070183 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78_007313eb-feab-4b33-a5f7-d0f6e842d722/pull/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.211406 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78_007313eb-feab-4b33-a5f7-d0f6e842d722/util/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.233748 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78_007313eb-feab-4b33-a5f7-d0f6e842d722/pull/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.243587 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651sbc78_007313eb-feab-4b33-a5f7-d0f6e842d722/extract/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.402499 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw_2bf34f71-118c-4f2c-9e6c-2a172dd903c4/util/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.536443 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw_2bf34f71-118c-4f2c-9e6c-2a172dd903c4/pull/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.538908 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw_2bf34f71-118c-4f2c-9e6c-2a172dd903c4/util/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.539952 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw_2bf34f71-118c-4f2c-9e6c-2a172dd903c4/pull/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.708165 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw_2bf34f71-118c-4f2c-9e6c-2a172dd903c4/pull/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.729777 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw_2bf34f71-118c-4f2c-9e6c-2a172dd903c4/util/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.736155 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p64pw_2bf34f71-118c-4f2c-9e6c-2a172dd903c4/extract/0.log" Feb 16 22:31:39 crc kubenswrapper[4777]: I0216 22:31:39.870996 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t_7e933fc5-7959-4f15-a628-f8f812aa6eae/util/0.log" Feb 16 22:31:40 crc kubenswrapper[4777]: I0216 22:31:40.054813 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t_7e933fc5-7959-4f15-a628-f8f812aa6eae/util/0.log" Feb 16 
22:31:40 crc kubenswrapper[4777]: I0216 22:31:40.090249 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t_7e933fc5-7959-4f15-a628-f8f812aa6eae/pull/0.log" Feb 16 22:31:40 crc kubenswrapper[4777]: I0216 22:31:40.095051 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t_7e933fc5-7959-4f15-a628-f8f812aa6eae/pull/0.log" Feb 16 22:31:40 crc kubenswrapper[4777]: E0216 22:31:40.188874 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:31:40 crc kubenswrapper[4777]: I0216 22:31:40.260686 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t_7e933fc5-7959-4f15-a628-f8f812aa6eae/pull/0.log" Feb 16 22:31:40 crc kubenswrapper[4777]: I0216 22:31:40.264299 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t_7e933fc5-7959-4f15-a628-f8f812aa6eae/util/0.log" Feb 16 22:31:40 crc kubenswrapper[4777]: I0216 22:31:40.272496 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213l227t_7e933fc5-7959-4f15-a628-f8f812aa6eae/extract/0.log" Feb 16 22:31:40 crc kubenswrapper[4777]: I0216 22:31:40.419611 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m65p9_ed77d1bb-44c6-4674-b17f-7c451302773a/extract-utilities/0.log" Feb 16 22:31:40 crc kubenswrapper[4777]: I0216 
22:31:40.594253 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m65p9_ed77d1bb-44c6-4674-b17f-7c451302773a/extract-utilities/0.log" Feb 16 22:31:40 crc kubenswrapper[4777]: I0216 22:31:40.617251 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m65p9_ed77d1bb-44c6-4674-b17f-7c451302773a/extract-content/0.log" Feb 16 22:31:40 crc kubenswrapper[4777]: I0216 22:31:40.622256 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m65p9_ed77d1bb-44c6-4674-b17f-7c451302773a/extract-content/0.log" Feb 16 22:31:40 crc kubenswrapper[4777]: I0216 22:31:40.809076 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m65p9_ed77d1bb-44c6-4674-b17f-7c451302773a/extract-content/0.log" Feb 16 22:31:40 crc kubenswrapper[4777]: I0216 22:31:40.861840 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m65p9_ed77d1bb-44c6-4674-b17f-7c451302773a/extract-utilities/0.log" Feb 16 22:31:41 crc kubenswrapper[4777]: I0216 22:31:41.092849 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9w66m_730c4cf5-a183-42de-be25-3521a01e5905/extract-utilities/0.log" Feb 16 22:31:41 crc kubenswrapper[4777]: I0216 22:31:41.215801 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9w66m_730c4cf5-a183-42de-be25-3521a01e5905/extract-utilities/0.log" Feb 16 22:31:41 crc kubenswrapper[4777]: I0216 22:31:41.280055 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9w66m_730c4cf5-a183-42de-be25-3521a01e5905/extract-content/0.log" Feb 16 22:31:41 crc kubenswrapper[4777]: I0216 22:31:41.284820 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-9w66m_730c4cf5-a183-42de-be25-3521a01e5905/extract-content/0.log" Feb 16 22:31:41 crc kubenswrapper[4777]: I0216 22:31:41.350007 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m65p9_ed77d1bb-44c6-4674-b17f-7c451302773a/registry-server/0.log" Feb 16 22:31:41 crc kubenswrapper[4777]: I0216 22:31:41.509687 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9w66m_730c4cf5-a183-42de-be25-3521a01e5905/extract-utilities/0.log" Feb 16 22:31:41 crc kubenswrapper[4777]: I0216 22:31:41.532662 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9w66m_730c4cf5-a183-42de-be25-3521a01e5905/extract-content/0.log" Feb 16 22:31:41 crc kubenswrapper[4777]: I0216 22:31:41.750232 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5_e7e0de6f-7050-4f09-b06d-ccab19191b6d/util/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.006916 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5_e7e0de6f-7050-4f09-b06d-ccab19191b6d/util/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.039850 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5_e7e0de6f-7050-4f09-b06d-ccab19191b6d/pull/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.044819 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5_e7e0de6f-7050-4f09-b06d-ccab19191b6d/pull/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.051074 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-9w66m_730c4cf5-a183-42de-be25-3521a01e5905/registry-server/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.217426 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5_e7e0de6f-7050-4f09-b06d-ccab19191b6d/pull/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.230166 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5_e7e0de6f-7050-4f09-b06d-ccab19191b6d/util/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.249584 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaxbmp5_e7e0de6f-7050-4f09-b06d-ccab19191b6d/extract/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.380887 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-58xj7_f5014a8f-fe3e-404c-a85c-6062fd0e76f7/marketplace-operator/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.415524 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52t67_c92089ab-26b0-4604-a957-9e37cc949736/extract-utilities/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.612315 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52t67_c92089ab-26b0-4604-a957-9e37cc949736/extract-utilities/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.614224 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52t67_c92089ab-26b0-4604-a957-9e37cc949736/extract-content/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.641415 4777 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-52t67_c92089ab-26b0-4604-a957-9e37cc949736/extract-content/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.830593 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52t67_c92089ab-26b0-4604-a957-9e37cc949736/extract-content/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.830663 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52t67_c92089ab-26b0-4604-a957-9e37cc949736/extract-utilities/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.875177 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-njbxw_00b2161c-e508-4ce9-bcdd-185a02223d1c/extract-utilities/0.log" Feb 16 22:31:42 crc kubenswrapper[4777]: I0216 22:31:42.924972 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-52t67_c92089ab-26b0-4604-a957-9e37cc949736/registry-server/0.log" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.082172 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-njbxw_00b2161c-e508-4ce9-bcdd-185a02223d1c/extract-utilities/0.log" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.094797 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-njbxw_00b2161c-e508-4ce9-bcdd-185a02223d1c/extract-content/0.log" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.121244 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-njbxw_00b2161c-e508-4ce9-bcdd-185a02223d1c/extract-content/0.log" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.241698 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-njbxw_00b2161c-e508-4ce9-bcdd-185a02223d1c/extract-content/0.log" Feb 
16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.265785 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-njbxw_00b2161c-e508-4ce9-bcdd-185a02223d1c/extract-utilities/0.log" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.639965 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nmgct"] Feb 16 22:31:43 crc kubenswrapper[4777]: E0216 22:31:43.640759 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3058a9c3-0db9-4854-9c91-ba8b79a1c46b" containerName="collect-profiles" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.640774 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="3058a9c3-0db9-4854-9c91-ba8b79a1c46b" containerName="collect-profiles" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.640987 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="3058a9c3-0db9-4854-9c91-ba8b79a1c46b" containerName="collect-profiles" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.642506 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.651491 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nmgct"] Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.682240 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-njbxw_00b2161c-e508-4ce9-bcdd-185a02223d1c/registry-server/0.log" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.773343 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fnkp\" (UniqueName: \"kubernetes.io/projected/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-kube-api-access-6fnkp\") pod \"community-operators-nmgct\" (UID: \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.773772 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-catalog-content\") pod \"community-operators-nmgct\" (UID: \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.773924 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-utilities\") pod \"community-operators-nmgct\" (UID: \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.899264 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-catalog-content\") pod 
\"community-operators-nmgct\" (UID: \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.900031 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-catalog-content\") pod \"community-operators-nmgct\" (UID: \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.900812 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-utilities\") pod \"community-operators-nmgct\" (UID: \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.900973 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fnkp\" (UniqueName: \"kubernetes.io/projected/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-kube-api-access-6fnkp\") pod \"community-operators-nmgct\" (UID: \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.901058 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-utilities\") pod \"community-operators-nmgct\" (UID: \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.921410 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fnkp\" (UniqueName: \"kubernetes.io/projected/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-kube-api-access-6fnkp\") pod \"community-operators-nmgct\" (UID: 
\"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:43 crc kubenswrapper[4777]: I0216 22:31:43.976851 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:44 crc kubenswrapper[4777]: I0216 22:31:44.684335 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nmgct"] Feb 16 22:31:45 crc kubenswrapper[4777]: I0216 22:31:45.006878 4777 generic.go:334] "Generic (PLEG): container finished" podID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" containerID="cff209a73bbe9e47385e8447cdfa060455dab0e494c6aa323bcb7baaf3a2921d" exitCode=0 Feb 16 22:31:45 crc kubenswrapper[4777]: I0216 22:31:45.006934 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgct" event={"ID":"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc","Type":"ContainerDied","Data":"cff209a73bbe9e47385e8447cdfa060455dab0e494c6aa323bcb7baaf3a2921d"} Feb 16 22:31:45 crc kubenswrapper[4777]: I0216 22:31:45.007211 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgct" event={"ID":"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc","Type":"ContainerStarted","Data":"a88142fb5bbcba7865abfa3445490f4323cc11df462e76e19f53ef672763c5f3"} Feb 16 22:31:46 crc kubenswrapper[4777]: I0216 22:31:46.020843 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgct" event={"ID":"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc","Type":"ContainerStarted","Data":"097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7"} Feb 16 22:31:46 crc kubenswrapper[4777]: E0216 22:31:46.744004 4777 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38fe03d6_b810_4cd4_8d50_bcfbbfadb8dc.slice/crio-097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38fe03d6_b810_4cd4_8d50_bcfbbfadb8dc.slice/crio-conmon-097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7.scope\": RecentStats: unable to find data in memory cache]" Feb 16 22:31:47 crc kubenswrapper[4777]: I0216 22:31:47.032131 4777 generic.go:334] "Generic (PLEG): container finished" podID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" containerID="097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7" exitCode=0 Feb 16 22:31:47 crc kubenswrapper[4777]: I0216 22:31:47.032200 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgct" event={"ID":"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc","Type":"ContainerDied","Data":"097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7"} Feb 16 22:31:48 crc kubenswrapper[4777]: I0216 22:31:48.044347 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgct" event={"ID":"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc","Type":"ContainerStarted","Data":"0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf"} Feb 16 22:31:48 crc kubenswrapper[4777]: I0216 22:31:48.069150 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nmgct" podStartSLOduration=2.579498949 podStartE2EDuration="5.069115038s" podCreationTimestamp="2026-02-16 22:31:43 +0000 UTC" firstStartedPulling="2026-02-16 22:31:45.00933386 +0000 UTC m=+3225.591834962" lastFinishedPulling="2026-02-16 22:31:47.498949919 +0000 UTC m=+3228.081451051" observedRunningTime="2026-02-16 22:31:48.063946303 +0000 UTC m=+3228.646447405" watchObservedRunningTime="2026-02-16 22:31:48.069115038 +0000 
UTC m=+3228.651616180" Feb 16 22:31:50 crc kubenswrapper[4777]: I0216 22:31:50.188405 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:31:50 crc kubenswrapper[4777]: E0216 22:31:50.188875 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:31:53 crc kubenswrapper[4777]: E0216 22:31:53.185172 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:31:53 crc kubenswrapper[4777]: I0216 22:31:53.978399 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:53 crc kubenswrapper[4777]: I0216 22:31:53.978491 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:54 crc kubenswrapper[4777]: I0216 22:31:54.051324 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:54 crc kubenswrapper[4777]: I0216 22:31:54.207950 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:54 crc kubenswrapper[4777]: I0216 22:31:54.289692 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-nmgct"] Feb 16 22:31:56 crc kubenswrapper[4777]: I0216 22:31:56.153249 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nmgct" podUID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" containerName="registry-server" containerID="cri-o://0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf" gracePeriod=2 Feb 16 22:31:56 crc kubenswrapper[4777]: I0216 22:31:56.705322 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:56 crc kubenswrapper[4777]: I0216 22:31:56.820804 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-utilities\") pod \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\" (UID: \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " Feb 16 22:31:56 crc kubenswrapper[4777]: I0216 22:31:56.821217 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fnkp\" (UniqueName: \"kubernetes.io/projected/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-kube-api-access-6fnkp\") pod \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\" (UID: \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " Feb 16 22:31:56 crc kubenswrapper[4777]: I0216 22:31:56.821269 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-catalog-content\") pod \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\" (UID: \"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc\") " Feb 16 22:31:56 crc kubenswrapper[4777]: I0216 22:31:56.821619 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-utilities" (OuterVolumeSpecName: "utilities") pod "38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" (UID: 
"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:31:56 crc kubenswrapper[4777]: I0216 22:31:56.822137 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 22:31:56 crc kubenswrapper[4777]: I0216 22:31:56.828872 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-kube-api-access-6fnkp" (OuterVolumeSpecName: "kube-api-access-6fnkp") pod "38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" (UID: "38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc"). InnerVolumeSpecName "kube-api-access-6fnkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:31:56 crc kubenswrapper[4777]: I0216 22:31:56.923467 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fnkp\" (UniqueName: \"kubernetes.io/projected/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-kube-api-access-6fnkp\") on node \"crc\" DevicePath \"\"" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.162764 4777 generic.go:334] "Generic (PLEG): container finished" podID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" containerID="0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf" exitCode=0 Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.162829 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgct" event={"ID":"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc","Type":"ContainerDied","Data":"0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf"} Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.162845 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nmgct" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.162877 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgct" event={"ID":"38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc","Type":"ContainerDied","Data":"a88142fb5bbcba7865abfa3445490f4323cc11df462e76e19f53ef672763c5f3"} Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.162898 4777 scope.go:117] "RemoveContainer" containerID="0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.181262 4777 scope.go:117] "RemoveContainer" containerID="097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.197946 4777 scope.go:117] "RemoveContainer" containerID="cff209a73bbe9e47385e8447cdfa060455dab0e494c6aa323bcb7baaf3a2921d" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.229601 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" (UID: "38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.231899 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.241753 4777 scope.go:117] "RemoveContainer" containerID="0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf" Feb 16 22:31:57 crc kubenswrapper[4777]: E0216 22:31:57.242390 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf\": container with ID starting with 0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf not found: ID does not exist" containerID="0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.242539 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf"} err="failed to get container status \"0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf\": rpc error: code = NotFound desc = could not find container \"0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf\": container with ID starting with 0b6dbe8ec0f2c3554e85b9810bde83ccc0a40660ac07416effa2d853788f49cf not found: ID does not exist" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.242627 4777 scope.go:117] "RemoveContainer" containerID="097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7" Feb 16 22:31:57 crc kubenswrapper[4777]: E0216 22:31:57.243110 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7\": container with ID starting with 097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7 not found: ID does not exist" containerID="097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.243186 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7"} err="failed to get container status \"097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7\": rpc error: code = NotFound desc = could not find container \"097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7\": container with ID starting with 097d60ff6f73f984e7d843c278ecf163143cb820be9e0e1b542337c970e2c1c7 not found: ID does not exist" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.243258 4777 scope.go:117] "RemoveContainer" containerID="cff209a73bbe9e47385e8447cdfa060455dab0e494c6aa323bcb7baaf3a2921d" Feb 16 22:31:57 crc kubenswrapper[4777]: E0216 22:31:57.243775 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cff209a73bbe9e47385e8447cdfa060455dab0e494c6aa323bcb7baaf3a2921d\": container with ID starting with cff209a73bbe9e47385e8447cdfa060455dab0e494c6aa323bcb7baaf3a2921d not found: ID does not exist" containerID="cff209a73bbe9e47385e8447cdfa060455dab0e494c6aa323bcb7baaf3a2921d" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.243820 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cff209a73bbe9e47385e8447cdfa060455dab0e494c6aa323bcb7baaf3a2921d"} err="failed to get container status \"cff209a73bbe9e47385e8447cdfa060455dab0e494c6aa323bcb7baaf3a2921d\": rpc error: code = NotFound desc = could not find container \"cff209a73bbe9e47385e8447cdfa060455dab0e494c6aa323bcb7baaf3a2921d\": container with ID 
starting with cff209a73bbe9e47385e8447cdfa060455dab0e494c6aa323bcb7baaf3a2921d not found: ID does not exist" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.502367 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nmgct"] Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.517229 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nmgct"] Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.856107 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77969c54dc-b2rtc_5f5e98cb-6704-49a9-93b5-4158dbddbb58/prometheus-operator-admission-webhook/0.log" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.905403 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-7nx5h_48b27685-17d6-45ee-a71a-965bac61e90c/prometheus-operator/0.log" Feb 16 22:31:57 crc kubenswrapper[4777]: I0216 22:31:57.926562 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-77969c54dc-jhj2b_c2deed2d-e90d-4062-817f-524d5413831d/prometheus-operator-admission-webhook/0.log" Feb 16 22:31:58 crc kubenswrapper[4777]: I0216 22:31:58.074029 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-4ljm7_995fa122-9f83-4b7c-97ce-4e1cfeb76b29/operator/0.log" Feb 16 22:31:58 crc kubenswrapper[4777]: I0216 22:31:58.120024 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-zktjn_0997c6ee-fff0-48ed-8234-72a4dde5f326/perses-operator/0.log" Feb 16 22:31:58 crc kubenswrapper[4777]: I0216 22:31:58.212927 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" 
path="/var/lib/kubelet/pods/38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc/volumes" Feb 16 22:32:01 crc kubenswrapper[4777]: I0216 22:32:01.182419 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:32:01 crc kubenswrapper[4777]: E0216 22:32:01.183048 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:32:08 crc kubenswrapper[4777]: E0216 22:32:08.183310 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:32:12 crc kubenswrapper[4777]: I0216 22:32:12.184842 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5b545f4d58-l9tqd_94269fb5-9977-4c2c-aa87-1fe5a841dc8d/kube-rbac-proxy/0.log" Feb 16 22:32:12 crc kubenswrapper[4777]: I0216 22:32:12.221702 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5b545f4d58-l9tqd_94269fb5-9977-4c2c-aa87-1fe5a841dc8d/manager/0.log" Feb 16 22:32:15 crc kubenswrapper[4777]: I0216 22:32:15.182097 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:32:15 crc kubenswrapper[4777]: E0216 22:32:15.184477 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:32:19 crc kubenswrapper[4777]: E0216 22:32:19.193967 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:32:28 crc kubenswrapper[4777]: I0216 22:32:28.201176 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:32:28 crc kubenswrapper[4777]: E0216 22:32:28.202156 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:32:34 crc kubenswrapper[4777]: E0216 22:32:34.183231 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:32:40 crc kubenswrapper[4777]: I0216 22:32:40.186889 4777 scope.go:117] "RemoveContainer" 
containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:32:40 crc kubenswrapper[4777]: E0216 22:32:40.187782 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h78cj_openshift-machine-config-operator(fbd6cb2a-0e80-4642-ad1e-993774971496)\"" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" Feb 16 22:32:49 crc kubenswrapper[4777]: E0216 22:32:49.185472 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:32:51 crc kubenswrapper[4777]: I0216 22:32:51.182962 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7" Feb 16 22:32:51 crc kubenswrapper[4777]: I0216 22:32:51.717274 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"d71587fbe87214a1c9817e9e16e758b3f2ecb88ba9d6b49af58686c4892a626e"} Feb 16 22:33:03 crc kubenswrapper[4777]: E0216 22:33:03.184401 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:33:15 crc kubenswrapper[4777]: E0216 22:33:15.185528 4777 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:33:30 crc kubenswrapper[4777]: E0216 22:33:30.189129 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:33:38 crc kubenswrapper[4777]: I0216 22:33:38.264699 4777 generic.go:334] "Generic (PLEG): container finished" podID="25a628c7-c0f3-4385-a217-4a37c82197b8" containerID="364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96" exitCode=0 Feb 16 22:33:38 crc kubenswrapper[4777]: I0216 22:33:38.264825 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tfd5/must-gather-wzvlc" event={"ID":"25a628c7-c0f3-4385-a217-4a37c82197b8","Type":"ContainerDied","Data":"364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96"} Feb 16 22:33:38 crc kubenswrapper[4777]: I0216 22:33:38.266322 4777 scope.go:117] "RemoveContainer" containerID="364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96" Feb 16 22:33:38 crc kubenswrapper[4777]: I0216 22:33:38.550476 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8tfd5_must-gather-wzvlc_25a628c7-c0f3-4385-a217-4a37c82197b8/gather/0.log" Feb 16 22:33:42 crc kubenswrapper[4777]: E0216 22:33:42.184125 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:33:47 crc kubenswrapper[4777]: I0216 22:33:47.482018 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8tfd5/must-gather-wzvlc"] Feb 16 22:33:47 crc kubenswrapper[4777]: I0216 22:33:47.483267 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8tfd5/must-gather-wzvlc" podUID="25a628c7-c0f3-4385-a217-4a37c82197b8" containerName="copy" containerID="cri-o://38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911" gracePeriod=2 Feb 16 22:33:47 crc kubenswrapper[4777]: I0216 22:33:47.505923 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8tfd5/must-gather-wzvlc"] Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.068276 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8tfd5_must-gather-wzvlc_25a628c7-c0f3-4385-a217-4a37c82197b8/copy/0.log" Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.071951 4777 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8tfd5/must-gather-wzvlc" Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.094883 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwgdg\" (UniqueName: \"kubernetes.io/projected/25a628c7-c0f3-4385-a217-4a37c82197b8-kube-api-access-kwgdg\") pod \"25a628c7-c0f3-4385-a217-4a37c82197b8\" (UID: \"25a628c7-c0f3-4385-a217-4a37c82197b8\") " Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.095241 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/25a628c7-c0f3-4385-a217-4a37c82197b8-must-gather-output\") pod \"25a628c7-c0f3-4385-a217-4a37c82197b8\" (UID: \"25a628c7-c0f3-4385-a217-4a37c82197b8\") " Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.106281 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a628c7-c0f3-4385-a217-4a37c82197b8-kube-api-access-kwgdg" (OuterVolumeSpecName: "kube-api-access-kwgdg") pod "25a628c7-c0f3-4385-a217-4a37c82197b8" (UID: "25a628c7-c0f3-4385-a217-4a37c82197b8"). InnerVolumeSpecName "kube-api-access-kwgdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.197235 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwgdg\" (UniqueName: \"kubernetes.io/projected/25a628c7-c0f3-4385-a217-4a37c82197b8-kube-api-access-kwgdg\") on node \"crc\" DevicePath \"\"" Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.253052 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25a628c7-c0f3-4385-a217-4a37c82197b8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "25a628c7-c0f3-4385-a217-4a37c82197b8" (UID: "25a628c7-c0f3-4385-a217-4a37c82197b8"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.299060 4777 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/25a628c7-c0f3-4385-a217-4a37c82197b8-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.388596 4777 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8tfd5_must-gather-wzvlc_25a628c7-c0f3-4385-a217-4a37c82197b8/copy/0.log"
Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.388841 4777 generic.go:334] "Generic (PLEG): container finished" podID="25a628c7-c0f3-4385-a217-4a37c82197b8" containerID="38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911" exitCode=143
Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.389196 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tfd5/must-gather-wzvlc"
Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.389405 4777 scope.go:117] "RemoveContainer" containerID="38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911"
Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.423527 4777 scope.go:117] "RemoveContainer" containerID="364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96"
Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.505228 4777 scope.go:117] "RemoveContainer" containerID="38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911"
Feb 16 22:33:48 crc kubenswrapper[4777]: E0216 22:33:48.505603 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911\": container with ID starting with 38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911 not found: ID does not exist" containerID="38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911"
Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.505640 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911"} err="failed to get container status \"38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911\": rpc error: code = NotFound desc = could not find container \"38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911\": container with ID starting with 38d7c1a29e3298259d62e2e044369c5ab79f3b010ba77cb021daa6b512c6c911 not found: ID does not exist"
Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.505667 4777 scope.go:117] "RemoveContainer" containerID="364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96"
Feb 16 22:33:48 crc kubenswrapper[4777]: E0216 22:33:48.505981 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96\": container with ID starting with 364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96 not found: ID does not exist" containerID="364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96"
Feb 16 22:33:48 crc kubenswrapper[4777]: I0216 22:33:48.506010 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96"} err="failed to get container status \"364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96\": rpc error: code = NotFound desc = could not find container \"364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96\": container with ID starting with 364779d9b0f4aa4d733eaaa3a8552ab572cc6bdfbd28f1f5a9061dfeb98c2b96 not found: ID does not exist"
Feb 16 22:33:50 crc kubenswrapper[4777]: I0216 22:33:50.198846 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a628c7-c0f3-4385-a217-4a37c82197b8" path="/var/lib/kubelet/pods/25a628c7-c0f3-4385-a217-4a37c82197b8/volumes"
Feb 16 22:33:53 crc kubenswrapper[4777]: E0216 22:33:53.184113 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:34:05 crc kubenswrapper[4777]: I0216 22:34:05.626311 4777 scope.go:117] "RemoveContainer" containerID="e244ca38ff8ec21d0ec5b9ec931bd4f5dd5a51f1a8f8edbc5b6cb9e954fd5cf1"
Feb 16 22:34:06 crc kubenswrapper[4777]: E0216 22:34:06.184170 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.278152 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g8b48"]
Feb 16 22:34:10 crc kubenswrapper[4777]: E0216 22:34:10.280549 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" containerName="extract-utilities"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.280595 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" containerName="extract-utilities"
Feb 16 22:34:10 crc kubenswrapper[4777]: E0216 22:34:10.280657 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a628c7-c0f3-4385-a217-4a37c82197b8" containerName="copy"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.280670 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a628c7-c0f3-4385-a217-4a37c82197b8" containerName="copy"
Feb 16 22:34:10 crc kubenswrapper[4777]: E0216 22:34:10.280709 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" containerName="extract-content"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.280749 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" containerName="extract-content"
Feb 16 22:34:10 crc kubenswrapper[4777]: E0216 22:34:10.280770 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" containerName="registry-server"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.280784 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" containerName="registry-server"
Feb 16 22:34:10 crc kubenswrapper[4777]: E0216 22:34:10.280831 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a628c7-c0f3-4385-a217-4a37c82197b8" containerName="gather"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.280842 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a628c7-c0f3-4385-a217-4a37c82197b8" containerName="gather"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.281347 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a628c7-c0f3-4385-a217-4a37c82197b8" containerName="copy"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.281394 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="38fe03d6-b810-4cd4-8d50-bcfbbfadb8dc" containerName="registry-server"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.281423 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a628c7-c0f3-4385-a217-4a37c82197b8" containerName="gather"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.284161 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.295894 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g8b48"]
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.334504 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-catalog-content\") pod \"redhat-operators-g8b48\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") " pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.334647 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-utilities\") pod \"redhat-operators-g8b48\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") " pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.335174 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv54v\" (UniqueName: \"kubernetes.io/projected/87d00c0d-d817-4fc1-b205-7c4a69637159-kube-api-access-rv54v\") pod \"redhat-operators-g8b48\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") " pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.436391 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv54v\" (UniqueName: \"kubernetes.io/projected/87d00c0d-d817-4fc1-b205-7c4a69637159-kube-api-access-rv54v\") pod \"redhat-operators-g8b48\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") " pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.436764 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-catalog-content\") pod \"redhat-operators-g8b48\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") " pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.436898 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-utilities\") pod \"redhat-operators-g8b48\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") " pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.437302 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-catalog-content\") pod \"redhat-operators-g8b48\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") " pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.437365 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-utilities\") pod \"redhat-operators-g8b48\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") " pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.463512 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv54v\" (UniqueName: \"kubernetes.io/projected/87d00c0d-d817-4fc1-b205-7c4a69637159-kube-api-access-rv54v\") pod \"redhat-operators-g8b48\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") " pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:10 crc kubenswrapper[4777]: I0216 22:34:10.604708 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:11 crc kubenswrapper[4777]: I0216 22:34:11.169892 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g8b48"]
Feb 16 22:34:11 crc kubenswrapper[4777]: I0216 22:34:11.654015 4777 generic.go:334] "Generic (PLEG): container finished" podID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerID="48dfec033a2291da23f4a85b26501b8137026c7b8710816644e752f0cd28ace7" exitCode=0
Feb 16 22:34:11 crc kubenswrapper[4777]: I0216 22:34:11.654081 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8b48" event={"ID":"87d00c0d-d817-4fc1-b205-7c4a69637159","Type":"ContainerDied","Data":"48dfec033a2291da23f4a85b26501b8137026c7b8710816644e752f0cd28ace7"}
Feb 16 22:34:11 crc kubenswrapper[4777]: I0216 22:34:11.654311 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8b48" event={"ID":"87d00c0d-d817-4fc1-b205-7c4a69637159","Type":"ContainerStarted","Data":"95f38662b347313ad0655ccd7e196e0f9a323c4b4603a88aa026856441c3c64d"}
Feb 16 22:34:11 crc kubenswrapper[4777]: I0216 22:34:11.655942 4777 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 16 22:34:12 crc kubenswrapper[4777]: I0216 22:34:12.674102 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8b48" event={"ID":"87d00c0d-d817-4fc1-b205-7c4a69637159","Type":"ContainerStarted","Data":"266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0"}
Feb 16 22:34:13 crc kubenswrapper[4777]: I0216 22:34:13.688533 4777 generic.go:334] "Generic (PLEG): container finished" podID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerID="266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0" exitCode=0
Feb 16 22:34:13 crc kubenswrapper[4777]: I0216 22:34:13.688841 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8b48" event={"ID":"87d00c0d-d817-4fc1-b205-7c4a69637159","Type":"ContainerDied","Data":"266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0"}
Feb 16 22:34:15 crc kubenswrapper[4777]: I0216 22:34:15.707345 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8b48" event={"ID":"87d00c0d-d817-4fc1-b205-7c4a69637159","Type":"ContainerStarted","Data":"f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825"}
Feb 16 22:34:15 crc kubenswrapper[4777]: I0216 22:34:15.729019 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g8b48" podStartSLOduration=2.866565396 podStartE2EDuration="5.728997779s" podCreationTimestamp="2026-02-16 22:34:10 +0000 UTC" firstStartedPulling="2026-02-16 22:34:11.655759178 +0000 UTC m=+3372.238260280" lastFinishedPulling="2026-02-16 22:34:14.518191561 +0000 UTC m=+3375.100692663" observedRunningTime="2026-02-16 22:34:15.723024422 +0000 UTC m=+3376.305525534" watchObservedRunningTime="2026-02-16 22:34:15.728997779 +0000 UTC m=+3376.311498901"
Feb 16 22:34:19 crc kubenswrapper[4777]: E0216 22:34:19.184364 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:34:20 crc kubenswrapper[4777]: I0216 22:34:20.605899 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:20 crc kubenswrapper[4777]: I0216 22:34:20.606272 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:21 crc kubenswrapper[4777]: I0216 22:34:21.682425 4777 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g8b48" podUID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerName="registry-server" probeResult="failure" output=<
Feb 16 22:34:21 crc kubenswrapper[4777]: timeout: failed to connect service ":50051" within 1s
Feb 16 22:34:21 crc kubenswrapper[4777]: >
Feb 16 22:34:30 crc kubenswrapper[4777]: I0216 22:34:30.684051 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:30 crc kubenswrapper[4777]: I0216 22:34:30.776758 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:30 crc kubenswrapper[4777]: I0216 22:34:30.931466 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g8b48"]
Feb 16 22:34:31 crc kubenswrapper[4777]: E0216 22:34:31.184125 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:34:31 crc kubenswrapper[4777]: I0216 22:34:31.852558 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g8b48" podUID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerName="registry-server" containerID="cri-o://f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825" gracePeriod=2
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.417492 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.501152 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-utilities\") pod \"87d00c0d-d817-4fc1-b205-7c4a69637159\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") "
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.501212 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv54v\" (UniqueName: \"kubernetes.io/projected/87d00c0d-d817-4fc1-b205-7c4a69637159-kube-api-access-rv54v\") pod \"87d00c0d-d817-4fc1-b205-7c4a69637159\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") "
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.501373 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-catalog-content\") pod \"87d00c0d-d817-4fc1-b205-7c4a69637159\" (UID: \"87d00c0d-d817-4fc1-b205-7c4a69637159\") "
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.502617 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-utilities" (OuterVolumeSpecName: "utilities") pod "87d00c0d-d817-4fc1-b205-7c4a69637159" (UID: "87d00c0d-d817-4fc1-b205-7c4a69637159"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.516273 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87d00c0d-d817-4fc1-b205-7c4a69637159-kube-api-access-rv54v" (OuterVolumeSpecName: "kube-api-access-rv54v") pod "87d00c0d-d817-4fc1-b205-7c4a69637159" (UID: "87d00c0d-d817-4fc1-b205-7c4a69637159"). InnerVolumeSpecName "kube-api-access-rv54v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.604341 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-utilities\") on node \"crc\" DevicePath \"\""
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.604410 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv54v\" (UniqueName: \"kubernetes.io/projected/87d00c0d-d817-4fc1-b205-7c4a69637159-kube-api-access-rv54v\") on node \"crc\" DevicePath \"\""
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.653174 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87d00c0d-d817-4fc1-b205-7c4a69637159" (UID: "87d00c0d-d817-4fc1-b205-7c4a69637159"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.706460 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87d00c0d-d817-4fc1-b205-7c4a69637159-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.868002 4777 generic.go:334] "Generic (PLEG): container finished" podID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerID="f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825" exitCode=0
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.868066 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8b48" event={"ID":"87d00c0d-d817-4fc1-b205-7c4a69637159","Type":"ContainerDied","Data":"f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825"}
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.868147 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g8b48"
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.868175 4777 scope.go:117] "RemoveContainer" containerID="f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825"
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.868154 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8b48" event={"ID":"87d00c0d-d817-4fc1-b205-7c4a69637159","Type":"ContainerDied","Data":"95f38662b347313ad0655ccd7e196e0f9a323c4b4603a88aa026856441c3c64d"}
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.896570 4777 scope.go:117] "RemoveContainer" containerID="266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0"
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.927608 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g8b48"]
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.944569 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g8b48"]
Feb 16 22:34:32 crc kubenswrapper[4777]: I0216 22:34:32.949831 4777 scope.go:117] "RemoveContainer" containerID="48dfec033a2291da23f4a85b26501b8137026c7b8710816644e752f0cd28ace7"
Feb 16 22:34:33 crc kubenswrapper[4777]: I0216 22:34:33.026310 4777 scope.go:117] "RemoveContainer" containerID="f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825"
Feb 16 22:34:33 crc kubenswrapper[4777]: E0216 22:34:33.027342 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825\": container with ID starting with f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825 not found: ID does not exist" containerID="f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825"
Feb 16 22:34:33 crc kubenswrapper[4777]: I0216 22:34:33.027387 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825"} err="failed to get container status \"f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825\": rpc error: code = NotFound desc = could not find container \"f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825\": container with ID starting with f3b17350ad003ccb039745272a5e496143642705851bc1947f7c68d19e886825 not found: ID does not exist"
Feb 16 22:34:33 crc kubenswrapper[4777]: I0216 22:34:33.027438 4777 scope.go:117] "RemoveContainer" containerID="266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0"
Feb 16 22:34:33 crc kubenswrapper[4777]: E0216 22:34:33.027957 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0\": container with ID starting with 266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0 not found: ID does not exist" containerID="266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0"
Feb 16 22:34:33 crc kubenswrapper[4777]: I0216 22:34:33.027991 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0"} err="failed to get container status \"266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0\": rpc error: code = NotFound desc = could not find container \"266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0\": container with ID starting with 266dca279358604291dcfce079a4d40d66777427248cb1845216b0a9d6c9f6b0 not found: ID does not exist"
Feb 16 22:34:33 crc kubenswrapper[4777]: I0216 22:34:33.028009 4777 scope.go:117] "RemoveContainer" containerID="48dfec033a2291da23f4a85b26501b8137026c7b8710816644e752f0cd28ace7"
Feb 16 22:34:33 crc kubenswrapper[4777]: E0216 22:34:33.028242 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48dfec033a2291da23f4a85b26501b8137026c7b8710816644e752f0cd28ace7\": container with ID starting with 48dfec033a2291da23f4a85b26501b8137026c7b8710816644e752f0cd28ace7 not found: ID does not exist" containerID="48dfec033a2291da23f4a85b26501b8137026c7b8710816644e752f0cd28ace7"
Feb 16 22:34:33 crc kubenswrapper[4777]: I0216 22:34:33.028290 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48dfec033a2291da23f4a85b26501b8137026c7b8710816644e752f0cd28ace7"} err="failed to get container status \"48dfec033a2291da23f4a85b26501b8137026c7b8710816644e752f0cd28ace7\": rpc error: code = NotFound desc = could not find container \"48dfec033a2291da23f4a85b26501b8137026c7b8710816644e752f0cd28ace7\": container with ID starting with 48dfec033a2291da23f4a85b26501b8137026c7b8710816644e752f0cd28ace7 not found: ID does not exist"
Feb 16 22:34:34 crc kubenswrapper[4777]: I0216 22:34:34.195497 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87d00c0d-d817-4fc1-b205-7c4a69637159" path="/var/lib/kubelet/pods/87d00c0d-d817-4fc1-b205-7c4a69637159/volumes"
Feb 16 22:34:46 crc kubenswrapper[4777]: E0216 22:34:46.323267 4777 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current"
Feb 16 22:34:46 crc kubenswrapper[4777]: E0216 22:34:46.324360 4777 kuberuntime_image.go:55] "Failed to pull image" err="initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current"
Feb 16 22:34:46 crc kubenswrapper[4777]: E0216 22:34:46.324690 4777 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4g7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4x94c_openstack(5f37d0cb-2453-4e2f-96de-de72db42d690): ErrImagePull: initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine" logger="UnhandledError"
Feb 16 22:34:46 crc kubenswrapper[4777]: E0216 22:34:46.326200 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"initializing source docker://quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current: reading manifest current in quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api: unknown: Tag current was deleted or has expired. To pull, revive via time machine\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:34:57 crc kubenswrapper[4777]: E0216 22:34:57.187441 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:35:08 crc kubenswrapper[4777]: E0216 22:35:08.183384 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:35:11 crc kubenswrapper[4777]: I0216 22:35:11.652336 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:35:11 crc kubenswrapper[4777]: I0216 22:35:11.653023 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:35:19 crc kubenswrapper[4777]: E0216 22:35:19.184015 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:35:31 crc kubenswrapper[4777]: E0216 22:35:31.186688 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:35:41 crc kubenswrapper[4777]: I0216 22:35:41.653548 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:35:41 crc kubenswrapper[4777]: I0216 22:35:41.654097 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:35:44 crc kubenswrapper[4777]: E0216 22:35:44.189040 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:35:55 crc kubenswrapper[4777]: E0216 22:35:55.183565 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:36:08 crc kubenswrapper[4777]: E0216 22:36:08.186336 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:36:11 crc kubenswrapper[4777]: I0216 22:36:11.651729 4777 patch_prober.go:28] interesting pod/machine-config-daemon-h78cj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 16 22:36:11 crc kubenswrapper[4777]: I0216 22:36:11.652114 4777 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 16 22:36:11 crc kubenswrapper[4777]: I0216 22:36:11.652173 4777 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h78cj"
Feb 16 22:36:11 crc kubenswrapper[4777]: I0216 22:36:11.653111 4777 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d71587fbe87214a1c9817e9e16e758b3f2ecb88ba9d6b49af58686c4892a626e"} pod="openshift-machine-config-operator/machine-config-daemon-h78cj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 16 22:36:11 crc kubenswrapper[4777]: I0216 22:36:11.653191 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" podUID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerName="machine-config-daemon" containerID="cri-o://d71587fbe87214a1c9817e9e16e758b3f2ecb88ba9d6b49af58686c4892a626e" gracePeriod=600
Feb 16 22:36:12 crc kubenswrapper[4777]: I0216 22:36:12.006332 4777 generic.go:334] "Generic (PLEG): container finished" podID="fbd6cb2a-0e80-4642-ad1e-993774971496" containerID="d71587fbe87214a1c9817e9e16e758b3f2ecb88ba9d6b49af58686c4892a626e" exitCode=0
Feb 16 22:36:12 crc kubenswrapper[4777]: I0216 22:36:12.006379 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerDied","Data":"d71587fbe87214a1c9817e9e16e758b3f2ecb88ba9d6b49af58686c4892a626e"}
Feb 16 22:36:12 crc kubenswrapper[4777]: I0216 22:36:12.006825 4777 scope.go:117] "RemoveContainer" containerID="0be3345207dd6bb67dc684e84b5ff1b962227af9b60649085f2681302d440dd7"
Feb 16 22:36:13 crc kubenswrapper[4777]: I0216 22:36:13.022023 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h78cj" event={"ID":"fbd6cb2a-0e80-4642-ad1e-993774971496","Type":"ContainerStarted","Data":"b57d5bf353508cb7ccbd0d9824b09c551088f599ce576c45b509bfbc9b9b6dc9"}
Feb 16 22:36:19 crc kubenswrapper[4777]: E0216 22:36:19.184066 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:36:34 crc kubenswrapper[4777]: E0216 22:36:34.185987 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:36:49 crc kubenswrapper[4777]: E0216 22:36:49.185375 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:37:01 crc kubenswrapper[4777]: E0216 22:37:01.186111 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:37:14 crc kubenswrapper[4777]: E0216 22:37:14.196116 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.466280 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-58l8v"]
Feb 16 22:37:20 crc kubenswrapper[4777]: E0216 22:37:20.467237 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerName="registry-server"
Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.467250 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerName="registry-server"
Feb 16 22:37:20 crc kubenswrapper[4777]:
E0216 22:37:20.467269 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerName="extract-utilities" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.467275 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerName="extract-utilities" Feb 16 22:37:20 crc kubenswrapper[4777]: E0216 22:37:20.467309 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerName="extract-content" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.467315 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerName="extract-content" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.467524 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="87d00c0d-d817-4fc1-b205-7c4a69637159" containerName="registry-server" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.468964 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.480384 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vxdm\" (UniqueName: \"kubernetes.io/projected/9bcfb825-853f-43ea-9894-c3980b215ad3-kube-api-access-4vxdm\") pod \"certified-operators-58l8v\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.480552 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-utilities\") pod \"certified-operators-58l8v\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.480597 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-catalog-content\") pod \"certified-operators-58l8v\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.485021 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58l8v"] Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.582799 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-utilities\") pod \"certified-operators-58l8v\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.582871 4777 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-catalog-content\") pod \"certified-operators-58l8v\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.582931 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vxdm\" (UniqueName: \"kubernetes.io/projected/9bcfb825-853f-43ea-9894-c3980b215ad3-kube-api-access-4vxdm\") pod \"certified-operators-58l8v\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.583286 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-utilities\") pod \"certified-operators-58l8v\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.583364 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-catalog-content\") pod \"certified-operators-58l8v\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.617140 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vxdm\" (UniqueName: \"kubernetes.io/projected/9bcfb825-853f-43ea-9894-c3980b215ad3-kube-api-access-4vxdm\") pod \"certified-operators-58l8v\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:20 crc kubenswrapper[4777]: I0216 22:37:20.793646 4777 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:21 crc kubenswrapper[4777]: I0216 22:37:21.318080 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58l8v"] Feb 16 22:37:21 crc kubenswrapper[4777]: I0216 22:37:21.856876 4777 generic.go:334] "Generic (PLEG): container finished" podID="9bcfb825-853f-43ea-9894-c3980b215ad3" containerID="ad783190b13a7c07900439ab7fc09e51bbda08f78aedca389b7cdd81e6a030d9" exitCode=0 Feb 16 22:37:21 crc kubenswrapper[4777]: I0216 22:37:21.857145 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58l8v" event={"ID":"9bcfb825-853f-43ea-9894-c3980b215ad3","Type":"ContainerDied","Data":"ad783190b13a7c07900439ab7fc09e51bbda08f78aedca389b7cdd81e6a030d9"} Feb 16 22:37:21 crc kubenswrapper[4777]: I0216 22:37:21.857170 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58l8v" event={"ID":"9bcfb825-853f-43ea-9894-c3980b215ad3","Type":"ContainerStarted","Data":"ac2cac6e313ead214b74a24c41e475c2abd611e78f189c2c8b8a40e0f702fef8"} Feb 16 22:37:22 crc kubenswrapper[4777]: I0216 22:37:22.871578 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58l8v" event={"ID":"9bcfb825-853f-43ea-9894-c3980b215ad3","Type":"ContainerStarted","Data":"a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5"} Feb 16 22:37:23 crc kubenswrapper[4777]: I0216 22:37:23.884843 4777 generic.go:334] "Generic (PLEG): container finished" podID="9bcfb825-853f-43ea-9894-c3980b215ad3" containerID="a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5" exitCode=0 Feb 16 22:37:23 crc kubenswrapper[4777]: I0216 22:37:23.884950 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58l8v" 
event={"ID":"9bcfb825-853f-43ea-9894-c3980b215ad3","Type":"ContainerDied","Data":"a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5"} Feb 16 22:37:24 crc kubenswrapper[4777]: I0216 22:37:24.896183 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58l8v" event={"ID":"9bcfb825-853f-43ea-9894-c3980b215ad3","Type":"ContainerStarted","Data":"931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb"} Feb 16 22:37:24 crc kubenswrapper[4777]: I0216 22:37:24.936476 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-58l8v" podStartSLOduration=2.43650081 podStartE2EDuration="4.936459724s" podCreationTimestamp="2026-02-16 22:37:20 +0000 UTC" firstStartedPulling="2026-02-16 22:37:21.86013249 +0000 UTC m=+3562.442633592" lastFinishedPulling="2026-02-16 22:37:24.360091394 +0000 UTC m=+3564.942592506" observedRunningTime="2026-02-16 22:37:24.927929744 +0000 UTC m=+3565.510430846" watchObservedRunningTime="2026-02-16 22:37:24.936459724 +0000 UTC m=+3565.518960826" Feb 16 22:37:28 crc kubenswrapper[4777]: E0216 22:37:28.186585 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:37:30 crc kubenswrapper[4777]: I0216 22:37:30.793774 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:30 crc kubenswrapper[4777]: I0216 22:37:30.794258 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:30 crc kubenswrapper[4777]: I0216 22:37:30.886128 4777 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:31 crc kubenswrapper[4777]: I0216 22:37:31.092248 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:31 crc kubenswrapper[4777]: I0216 22:37:31.163110 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58l8v"] Feb 16 22:37:33 crc kubenswrapper[4777]: I0216 22:37:33.036179 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-58l8v" podUID="9bcfb825-853f-43ea-9894-c3980b215ad3" containerName="registry-server" containerID="cri-o://931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb" gracePeriod=2 Feb 16 22:37:33 crc kubenswrapper[4777]: I0216 22:37:33.695948 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:33 crc kubenswrapper[4777]: I0216 22:37:33.727909 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vxdm\" (UniqueName: \"kubernetes.io/projected/9bcfb825-853f-43ea-9894-c3980b215ad3-kube-api-access-4vxdm\") pod \"9bcfb825-853f-43ea-9894-c3980b215ad3\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " Feb 16 22:37:33 crc kubenswrapper[4777]: I0216 22:37:33.728188 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-catalog-content\") pod \"9bcfb825-853f-43ea-9894-c3980b215ad3\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " Feb 16 22:37:33 crc kubenswrapper[4777]: I0216 22:37:33.728302 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-utilities\") pod 
\"9bcfb825-853f-43ea-9894-c3980b215ad3\" (UID: \"9bcfb825-853f-43ea-9894-c3980b215ad3\") " Feb 16 22:37:33 crc kubenswrapper[4777]: I0216 22:37:33.729475 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-utilities" (OuterVolumeSpecName: "utilities") pod "9bcfb825-853f-43ea-9894-c3980b215ad3" (UID: "9bcfb825-853f-43ea-9894-c3980b215ad3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:37:33 crc kubenswrapper[4777]: I0216 22:37:33.730913 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-utilities\") on node \"crc\" DevicePath \"\"" Feb 16 22:37:33 crc kubenswrapper[4777]: I0216 22:37:33.738036 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bcfb825-853f-43ea-9894-c3980b215ad3-kube-api-access-4vxdm" (OuterVolumeSpecName: "kube-api-access-4vxdm") pod "9bcfb825-853f-43ea-9894-c3980b215ad3" (UID: "9bcfb825-853f-43ea-9894-c3980b215ad3"). InnerVolumeSpecName "kube-api-access-4vxdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 16 22:37:33 crc kubenswrapper[4777]: I0216 22:37:33.821857 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bcfb825-853f-43ea-9894-c3980b215ad3" (UID: "9bcfb825-853f-43ea-9894-c3980b215ad3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 16 22:37:33 crc kubenswrapper[4777]: I0216 22:37:33.833401 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vxdm\" (UniqueName: \"kubernetes.io/projected/9bcfb825-853f-43ea-9894-c3980b215ad3-kube-api-access-4vxdm\") on node \"crc\" DevicePath \"\"" Feb 16 22:37:33 crc kubenswrapper[4777]: I0216 22:37:33.833439 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcfb825-853f-43ea-9894-c3980b215ad3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.055652 4777 generic.go:334] "Generic (PLEG): container finished" podID="9bcfb825-853f-43ea-9894-c3980b215ad3" containerID="931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb" exitCode=0 Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.055771 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58l8v" Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.055705 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58l8v" event={"ID":"9bcfb825-853f-43ea-9894-c3980b215ad3","Type":"ContainerDied","Data":"931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb"} Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.057214 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58l8v" event={"ID":"9bcfb825-853f-43ea-9894-c3980b215ad3","Type":"ContainerDied","Data":"ac2cac6e313ead214b74a24c41e475c2abd611e78f189c2c8b8a40e0f702fef8"} Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.057251 4777 scope.go:117] "RemoveContainer" containerID="931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb" Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.092789 4777 scope.go:117] "RemoveContainer" 
containerID="a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5" Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.125856 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58l8v"] Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.141219 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-58l8v"] Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.146912 4777 scope.go:117] "RemoveContainer" containerID="ad783190b13a7c07900439ab7fc09e51bbda08f78aedca389b7cdd81e6a030d9" Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.204249 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bcfb825-853f-43ea-9894-c3980b215ad3" path="/var/lib/kubelet/pods/9bcfb825-853f-43ea-9894-c3980b215ad3/volumes" Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.208575 4777 scope.go:117] "RemoveContainer" containerID="931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb" Feb 16 22:37:34 crc kubenswrapper[4777]: E0216 22:37:34.209133 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb\": container with ID starting with 931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb not found: ID does not exist" containerID="931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb" Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.209175 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb"} err="failed to get container status \"931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb\": rpc error: code = NotFound desc = could not find container \"931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb\": container with ID starting with 
931fc3878d31c25a0be873e3cbaa3b58d9575a422d45d32b014e9be8d6c46dbb not found: ID does not exist" Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.209203 4777 scope.go:117] "RemoveContainer" containerID="a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5" Feb 16 22:37:34 crc kubenswrapper[4777]: E0216 22:37:34.209618 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5\": container with ID starting with a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5 not found: ID does not exist" containerID="a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5" Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.209645 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5"} err="failed to get container status \"a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5\": rpc error: code = NotFound desc = could not find container \"a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5\": container with ID starting with a9d7c2d00a411cc1778d89148b012f6f8091b62828690d6983f57db7093dd7e5 not found: ID does not exist" Feb 16 22:37:34 crc kubenswrapper[4777]: I0216 22:37:34.209668 4777 scope.go:117] "RemoveContainer" containerID="ad783190b13a7c07900439ab7fc09e51bbda08f78aedca389b7cdd81e6a030d9" Feb 16 22:37:34 crc kubenswrapper[4777]: E0216 22:37:34.210118 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad783190b13a7c07900439ab7fc09e51bbda08f78aedca389b7cdd81e6a030d9\": container with ID starting with ad783190b13a7c07900439ab7fc09e51bbda08f78aedca389b7cdd81e6a030d9 not found: ID does not exist" containerID="ad783190b13a7c07900439ab7fc09e51bbda08f78aedca389b7cdd81e6a030d9" Feb 16 22:37:34 crc 
kubenswrapper[4777]: I0216 22:37:34.210169 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad783190b13a7c07900439ab7fc09e51bbda08f78aedca389b7cdd81e6a030d9"} err="failed to get container status \"ad783190b13a7c07900439ab7fc09e51bbda08f78aedca389b7cdd81e6a030d9\": rpc error: code = NotFound desc = could not find container \"ad783190b13a7c07900439ab7fc09e51bbda08f78aedca389b7cdd81e6a030d9\": container with ID starting with ad783190b13a7c07900439ab7fc09e51bbda08f78aedca389b7cdd81e6a030d9 not found: ID does not exist" Feb 16 22:37:41 crc kubenswrapper[4777]: E0216 22:37:41.185108 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.579774 4777 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-98tgh"] Feb 16 22:37:43 crc kubenswrapper[4777]: E0216 22:37:43.580779 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcfb825-853f-43ea-9894-c3980b215ad3" containerName="extract-utilities" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.580803 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcfb825-853f-43ea-9894-c3980b215ad3" containerName="extract-utilities" Feb 16 22:37:43 crc kubenswrapper[4777]: E0216 22:37:43.580832 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcfb825-853f-43ea-9894-c3980b215ad3" containerName="extract-content" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.580843 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcfb825-853f-43ea-9894-c3980b215ad3" containerName="extract-content" Feb 16 22:37:43 crc kubenswrapper[4777]: E0216 
22:37:43.580893 4777 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcfb825-853f-43ea-9894-c3980b215ad3" containerName="registry-server" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.580905 4777 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcfb825-853f-43ea-9894-c3980b215ad3" containerName="registry-server" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.581189 4777 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bcfb825-853f-43ea-9894-c3980b215ad3" containerName="registry-server" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.583226 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98tgh" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.590608 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-98tgh"] Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.718628 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-utilities\") pod \"redhat-marketplace-98tgh\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") " pod="openshift-marketplace/redhat-marketplace-98tgh" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.719858 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzzwd\" (UniqueName: \"kubernetes.io/projected/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-kube-api-access-kzzwd\") pod \"redhat-marketplace-98tgh\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") " pod="openshift-marketplace/redhat-marketplace-98tgh" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.720072 4777 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-catalog-content\") pod \"redhat-marketplace-98tgh\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") " pod="openshift-marketplace/redhat-marketplace-98tgh" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.822323 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-catalog-content\") pod \"redhat-marketplace-98tgh\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") " pod="openshift-marketplace/redhat-marketplace-98tgh" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.822465 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-utilities\") pod \"redhat-marketplace-98tgh\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") " pod="openshift-marketplace/redhat-marketplace-98tgh" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.822510 4777 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzzwd\" (UniqueName: \"kubernetes.io/projected/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-kube-api-access-kzzwd\") pod \"redhat-marketplace-98tgh\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") " pod="openshift-marketplace/redhat-marketplace-98tgh" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.823017 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-utilities\") pod \"redhat-marketplace-98tgh\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") " pod="openshift-marketplace/redhat-marketplace-98tgh" Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.823309 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-catalog-content\") pod \"redhat-marketplace-98tgh\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") " pod="openshift-marketplace/redhat-marketplace-98tgh"
Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.858352 4777 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzzwd\" (UniqueName: \"kubernetes.io/projected/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-kube-api-access-kzzwd\") pod \"redhat-marketplace-98tgh\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") " pod="openshift-marketplace/redhat-marketplace-98tgh"
Feb 16 22:37:43 crc kubenswrapper[4777]: I0216 22:37:43.920448 4777 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98tgh"
Feb 16 22:37:44 crc kubenswrapper[4777]: I0216 22:37:44.509566 4777 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-98tgh"]
Feb 16 22:37:45 crc kubenswrapper[4777]: I0216 22:37:45.207504 4777 generic.go:334] "Generic (PLEG): container finished" podID="e45e6e2c-7f08-4e25-a2dc-398feed20e0c" containerID="5c21f565e1e7950a8b72eaa3d9cb1a4ac451d9aff79408fe2ae10b717e960e8e" exitCode=0
Feb 16 22:37:45 crc kubenswrapper[4777]: I0216 22:37:45.207996 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tgh" event={"ID":"e45e6e2c-7f08-4e25-a2dc-398feed20e0c","Type":"ContainerDied","Data":"5c21f565e1e7950a8b72eaa3d9cb1a4ac451d9aff79408fe2ae10b717e960e8e"}
Feb 16 22:37:45 crc kubenswrapper[4777]: I0216 22:37:45.208037 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tgh" event={"ID":"e45e6e2c-7f08-4e25-a2dc-398feed20e0c","Type":"ContainerStarted","Data":"b09c6a8aa40c453edfa2eb68b0e221de8ef44e043cd8c6f0e334912b0c8e1617"}
Feb 16 22:37:46 crc kubenswrapper[4777]: I0216 22:37:46.227624 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tgh" event={"ID":"e45e6e2c-7f08-4e25-a2dc-398feed20e0c","Type":"ContainerStarted","Data":"ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506"}
Feb 16 22:37:47 crc kubenswrapper[4777]: I0216 22:37:47.246090 4777 generic.go:334] "Generic (PLEG): container finished" podID="e45e6e2c-7f08-4e25-a2dc-398feed20e0c" containerID="ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506" exitCode=0
Feb 16 22:37:47 crc kubenswrapper[4777]: I0216 22:37:47.246510 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tgh" event={"ID":"e45e6e2c-7f08-4e25-a2dc-398feed20e0c","Type":"ContainerDied","Data":"ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506"}
Feb 16 22:37:48 crc kubenswrapper[4777]: I0216 22:37:48.266217 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tgh" event={"ID":"e45e6e2c-7f08-4e25-a2dc-398feed20e0c","Type":"ContainerStarted","Data":"e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9"}
Feb 16 22:37:48 crc kubenswrapper[4777]: I0216 22:37:48.301244 4777 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-98tgh" podStartSLOduration=2.579750313 podStartE2EDuration="5.301218181s" podCreationTimestamp="2026-02-16 22:37:43 +0000 UTC" firstStartedPulling="2026-02-16 22:37:45.210816873 +0000 UTC m=+3585.793317985" lastFinishedPulling="2026-02-16 22:37:47.932284741 +0000 UTC m=+3588.514785853" observedRunningTime="2026-02-16 22:37:48.293390772 +0000 UTC m=+3588.875891914" watchObservedRunningTime="2026-02-16 22:37:48.301218181 +0000 UTC m=+3588.883719303"
Feb 16 22:37:53 crc kubenswrapper[4777]: I0216 22:37:53.921510 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-98tgh"
Feb 16 22:37:53 crc kubenswrapper[4777]: I0216 22:37:53.923785 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-98tgh"
Feb 16 22:37:54 crc kubenswrapper[4777]: I0216 22:37:54.017949 4777 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-98tgh"
Feb 16 22:37:54 crc kubenswrapper[4777]: I0216 22:37:54.503270 4777 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-98tgh"
Feb 16 22:37:54 crc kubenswrapper[4777]: I0216 22:37:54.569430 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-98tgh"]
Feb 16 22:37:55 crc kubenswrapper[4777]: E0216 22:37:55.184118 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:37:56 crc kubenswrapper[4777]: I0216 22:37:56.372028 4777 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-98tgh" podUID="e45e6e2c-7f08-4e25-a2dc-398feed20e0c" containerName="registry-server" containerID="cri-o://e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9" gracePeriod=2
Feb 16 22:37:56 crc kubenswrapper[4777]: I0216 22:37:56.907354 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98tgh"
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:56.992769 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzzwd\" (UniqueName: \"kubernetes.io/projected/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-kube-api-access-kzzwd\") pod \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") "
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:56.992858 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-utilities\") pod \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") "
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:56.992990 4777 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-catalog-content\") pod \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\" (UID: \"e45e6e2c-7f08-4e25-a2dc-398feed20e0c\") "
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:56.994241 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-utilities" (OuterVolumeSpecName: "utilities") pod "e45e6e2c-7f08-4e25-a2dc-398feed20e0c" (UID: "e45e6e2c-7f08-4e25-a2dc-398feed20e0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.005898 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-kube-api-access-kzzwd" (OuterVolumeSpecName: "kube-api-access-kzzwd") pod "e45e6e2c-7f08-4e25-a2dc-398feed20e0c" (UID: "e45e6e2c-7f08-4e25-a2dc-398feed20e0c"). InnerVolumeSpecName "kube-api-access-kzzwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.020595 4777 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e45e6e2c-7f08-4e25-a2dc-398feed20e0c" (UID: "e45e6e2c-7f08-4e25-a2dc-398feed20e0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.095421 4777 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-utilities\") on node \"crc\" DevicePath \"\""
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.095469 4777 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.095487 4777 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzzwd\" (UniqueName: \"kubernetes.io/projected/e45e6e2c-7f08-4e25-a2dc-398feed20e0c-kube-api-access-kzzwd\") on node \"crc\" DevicePath \"\""
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.381648 4777 generic.go:334] "Generic (PLEG): container finished" podID="e45e6e2c-7f08-4e25-a2dc-398feed20e0c" containerID="e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9" exitCode=0
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.381692 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tgh" event={"ID":"e45e6e2c-7f08-4e25-a2dc-398feed20e0c","Type":"ContainerDied","Data":"e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9"}
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.381733 4777 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tgh" event={"ID":"e45e6e2c-7f08-4e25-a2dc-398feed20e0c","Type":"ContainerDied","Data":"b09c6a8aa40c453edfa2eb68b0e221de8ef44e043cd8c6f0e334912b0c8e1617"}
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.381751 4777 scope.go:117] "RemoveContainer" containerID="e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9"
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.381867 4777 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98tgh"
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.407869 4777 scope.go:117] "RemoveContainer" containerID="ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506"
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.447397 4777 scope.go:117] "RemoveContainer" containerID="5c21f565e1e7950a8b72eaa3d9cb1a4ac451d9aff79408fe2ae10b717e960e8e"
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.447868 4777 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-98tgh"]
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.459104 4777 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-98tgh"]
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.479790 4777 scope.go:117] "RemoveContainer" containerID="e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9"
Feb 16 22:37:57 crc kubenswrapper[4777]: E0216 22:37:57.480275 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9\": container with ID starting with e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9 not found: ID does not exist" containerID="e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9"
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.480335 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9"} err="failed to get container status \"e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9\": rpc error: code = NotFound desc = could not find container \"e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9\": container with ID starting with e9def230e3bcf8e4c071a0f7893ec23ec8e06dfb54590c32bee8408c195028e9 not found: ID does not exist"
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.480363 4777 scope.go:117] "RemoveContainer" containerID="ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506"
Feb 16 22:37:57 crc kubenswrapper[4777]: E0216 22:37:57.480698 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506\": container with ID starting with ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506 not found: ID does not exist" containerID="ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506"
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.480802 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506"} err="failed to get container status \"ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506\": rpc error: code = NotFound desc = could not find container \"ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506\": container with ID starting with ce03b270ddba03916be26a85c5da2ef008e48d76e54edd4935d18c3470ce2506 not found: ID does not exist"
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.480872 4777 scope.go:117] "RemoveContainer" containerID="5c21f565e1e7950a8b72eaa3d9cb1a4ac451d9aff79408fe2ae10b717e960e8e"
Feb 16 22:37:57 crc kubenswrapper[4777]: E0216 22:37:57.481273 4777 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c21f565e1e7950a8b72eaa3d9cb1a4ac451d9aff79408fe2ae10b717e960e8e\": container with ID starting with 5c21f565e1e7950a8b72eaa3d9cb1a4ac451d9aff79408fe2ae10b717e960e8e not found: ID does not exist" containerID="5c21f565e1e7950a8b72eaa3d9cb1a4ac451d9aff79408fe2ae10b717e960e8e"
Feb 16 22:37:57 crc kubenswrapper[4777]: I0216 22:37:57.481296 4777 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c21f565e1e7950a8b72eaa3d9cb1a4ac451d9aff79408fe2ae10b717e960e8e"} err="failed to get container status \"5c21f565e1e7950a8b72eaa3d9cb1a4ac451d9aff79408fe2ae10b717e960e8e\": rpc error: code = NotFound desc = could not find container \"5c21f565e1e7950a8b72eaa3d9cb1a4ac451d9aff79408fe2ae10b717e960e8e\": container with ID starting with 5c21f565e1e7950a8b72eaa3d9cb1a4ac451d9aff79408fe2ae10b717e960e8e not found: ID does not exist"
Feb 16 22:37:58 crc kubenswrapper[4777]: I0216 22:37:58.193251 4777 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e45e6e2c-7f08-4e25-a2dc-398feed20e0c" path="/var/lib/kubelet/pods/e45e6e2c-7f08-4e25-a2dc-398feed20e0c/volumes"
Feb 16 22:38:10 crc kubenswrapper[4777]: E0216 22:38:10.194617 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:38:22 crc kubenswrapper[4777]: E0216 22:38:22.185773 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"
Feb 16 22:38:34 crc kubenswrapper[4777]: E0216 22:38:34.185215 4777 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4x94c" podUID="5f37d0cb-2453-4e2f-96de-de72db42d690"